2019-06-21 16:05:06 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 16:05:06 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 16:05:06 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 16:05:06 [scrapy.extensions.telnet] INFO: Telnet Password: 29ed4ce8f69e2738
2019-06-21 16:05:06 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
2019-06-21 16:05:07 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 16:05:07 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 16:05:07 [scrapy.middleware] INFO: Enabled item pipelines: []
2019-06-21 16:05:07 [scrapy.core.engine] INFO: Spider opened
2019-06-21 16:05:07 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 16:05:07 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 16:05:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:05:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:05:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:05:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:05:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') 
as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
16:05:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:05:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:05:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
[... the identical FileNotFoundError traceback, DEBUG "Crawled (200)" lines, and "Invalid DNS-ID" TLS warnings repeat for every response from 16:05:23 through 16:05:27; duplicates omitted ...]
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:29 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 16:05:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.'))
2019-06-21 16:05:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:05:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:05:30 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 1 times): TCP connection timed out: 10060: The connection attempt failed because the connected party did not properly respond after a period of time, or the connected host failed to respond.
2019-06-21 16:05:31 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 1 times): TCP connection timed out: 10060: The connection attempt failed because the connected party did not properly respond after a period of time, or the connected host failed to respond.
[... the "Invalid DNS-ID" TLS warning, the "Crawled (200)" DEBUG entries, and the identical FileNotFoundError traceback above repeat for every response from 16:05:29 through 16:05:33; the duplicate entries are elided ...]
2019-06-21 16:05:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: 
[Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:35 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None)
host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... identical "Crawled (200)" DEBUG entries, "Invalid DNS-ID." TLS WARNINGs, and FileNotFoundError tracebacks for '../collectSports/conf/hg0088.json' repeat for every response through 2019-06-21 16:05:43 ...]
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:44 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 
339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 
29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:49 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider) 2019-06-21 16:05:49 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0 2019-06-21 16:05:49 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 
'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 16:05:49 [scrapy.extensions.telnet] INFO: Telnet Password: 6f8625c24de8e737
2019-06-21 16:05:49 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 16:05:49 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 16:05:49 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 16:05:49 [scrapy.middleware] INFO: Enabled item pipelines: []
2019-06-21 16:05:49 [scrapy.core.engine] INFO: Spider opened
2019-06-21 16:05:49 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 16:05:49 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 16:05:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:05:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:50 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
[identical Crawled (200) / TLS-warning / Spider-error traceback entries repeat for every response from 16:05:50 through 16:05:51; the log excerpt ends mid-traceback]
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result 
or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:54 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:05:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:05:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the identical "Spider error processing" traceback (FileNotFoundError: '../collectSports/conf/hg0088.json'), interleaved with further "Crawled (200)" DEBUG entries and "Invalid DNS-ID." TLS WARNING entries for host "205.201.4.177", repeats for every response from 16:05:54 through 16:05:57; duplicate entries omitted ...]
2019-06-21 16:05:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from 
host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in 
result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:05:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:06:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:06:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 
339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", 
line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:04 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) 
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback 
(most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[identical "Spider error processing" tracebacks, "Invalid DNS-ID." TLS warnings, and "Crawled (200)" DEBUG entries repeat through 2019-06-21 16:06:16]
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:06:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py",
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
2019-06-21 16:06:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[... the "Invalid DNS-ID" TLS WARNING, the "Crawled (200)" DEBUG line, and the FileNotFoundError traceback above repeat continuously from 16:06:26 through 16:06:30; identical repeats omitted ...]
2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () 
if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent 
call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:31 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:32 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in 
result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for 
r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in 
result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
'../collectSports/conf/hg0088.json' 2019-06-21 16:06:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 
339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in 
result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:44 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:06:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:49 [scrapy.extensions.logstats] INFO: Crawled 539 pages (at 539 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 16:06:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:50 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in 
result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:50 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:50 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return 
(r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", 
line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as 
hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:54 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the same three messages repeat verbatim from 16:06:54 through 16:06:57: the WARNING above (Invalid DNS-ID for host "205.201.4.177"), DEBUG "Crawled (200)" lines, and the ERROR traceback ending in FileNotFoundError for '../collectSports/conf/hg0088.json' ...]
None) 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:59 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:06:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:06:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:06:59 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:06:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:06:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:06:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  ...
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
[identical Crawled (200), Invalid DNS-ID, and FileNotFoundError entries repeat for every response through 2019-06-21 16:07:02]
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:04 [scrapy.core.scraper] ERROR: 
Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:07:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:07:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:07:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the same ERROR traceback, DEBUG "Crawled (200)" lines, and TLS certificate WARNING repeat for every response crawled between 2019-06-21 16:07:04 and 2019-06-21 16:07:07 ...]
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", 
line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", 
line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
2019-06-21 16:07:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:07:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:07:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:12 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:14 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:14 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the same Spider error traceback, TLS warning, and Crawled (200) DEBUG entries repeat verbatim from 16:07:14 through 16:07:17 ...]
2019-06-21 16:07:17
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:17 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 16:07:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", 
line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 
16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for 
x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.downloader.tls] 
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:24 [scrapy.core.downloader.tls] 
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:07:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.downloader.tls] 
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:30 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:30 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:07:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 16:07:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 16:07:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:07:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 
16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 16:07:35 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 16:07:38 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 16:07:38 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 500479,
 'downloader/request_count': 1007,
 'downloader/request_method_count/POST': 1007,
 'downloader/response_bytes': 307135,
 'downloader/response_count': 1007,
 'downloader/response_status_count/200': 1007,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2019, 6, 21, 8, 7, 38, 811979),
 'log_count/DEBUG': 1007,
 'log_count/ERROR': 1007,
 'log_count/INFO': 10,
 'log_count/WARNING': 1007,
 'response_received_count': 1007,
 'scheduler/dequeued': 1007,
 'scheduler/dequeued/memory': 1007,
 'scheduler/enqueued': 1007,
 'scheduler/enqueued/memory': 1007,
 'spider_exceptions/FileNotFoundError': 1007,
 'start_time': datetime.datetime(2019, 6, 21, 8, 5, 49, 895176)}
2019-06-21 16:07:38 [scrapy.core.engine] INFO: Spider closed (finished)
2019-06-21 17:35:26 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 17:35:26 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 17:35:26 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 17:35:26 [scrapy.extensions.telnet] INFO: Telnet Password: f40d7b207f7c4deb
2019-06-21 17:35:26 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
2019-06-21 17:35:26 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 17:35:26 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 17:35:26 [scrapy.middleware] INFO: Enabled item pipelines: []
2019-06-21 17:35:26 [scrapy.core.engine] INFO: Spider opened
2019-06-21 17:35:26 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 17:35:26 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 17:35:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:35:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:35:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:35:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:35:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] 
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r 
in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () 
if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:34 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
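The `FileNotFoundError` repeated throughout this log comes from `sports.py` line 96 opening the config with a path relative to the process's current working directory (`'../collectSports/conf/hg0088.json'`), which only works when `scrapy crawl` happens to be launched from a specific directory. A minimal sketch of one common fix, assuming the project layout implied by the paths in the traceback (`collectSports/spiders/sports.py` alongside `collectSports/conf/hg0088.json`), is to resolve the path from the module's own location instead:

```python
import json
from pathlib import Path

# Hypothetical helper, not from the original spider: build the config path
# from this module's location (collectSports/spiders/ -> collectSports/conf/)
# so it no longer depends on where `scrapy crawl` is launched from.
CONF_PATH = Path(__file__).resolve().parent.parent / 'conf' / 'hg0088.json'

def load_hg0088_conf():
    """Load the hg0088 config via an absolute path derived from this file."""
    with open(CONF_PATH, 'r', encoding='utf8') as hg:
        return json.load(hg)
```

Replacing the relative `open()` call in `parse` with such a helper would stop the crawl from erroring on every response while the working directory differs from the one the path was written for.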
2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r 
in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in 
result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for 
r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 
37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:35:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:35:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:35:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... identical "Crawled (200)" DEBUG entries, "Invalid DNS-ID." TLS WARNING entries for host "205.201.4.177", and the same FileNotFoundError traceback from sports.py line 96 repeat for every crawled response from 17:35:41 through 17:35:44 ...]
2019-06-21 17:35:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:35:44
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:35:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or 
directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:45 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:46 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in 
result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for 
r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback 
yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:35:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:35:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:35:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[... the identical ERROR traceback, TLS WARNING, and DEBUG "Crawled (200)" records repeat continuously from 17:35:48 through 17:35:51 ...]
17:35:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:51 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return 
(r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:35:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:35:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:35:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the same "Spider error processing" traceback ending in this FileNotFoundError, the "Crawled (200)" DEBUG line, and the 'Invalid DNS-ID.' TLS WARNING repeat for every response through 2019-06-21 17:35:58 ...]
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:58 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for 
r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:35:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result 
or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:35:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:35:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
2019-06-21 17:36:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[The ERROR traceback, TLS certificate WARNING, and DEBUG "Crawled (200)" entries above repeat verbatim for every response between 17:36:00 and 17:36:02; duplicates omitted.]
2019-06-21 17:36:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
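Every traceback above has the same root cause: sports.py line 96 opens the config with a path relative to the process's current working directory, so the file is only found when the crawl is launched from one specific directory. A minimal sketch of a working-directory-independent load, assuming conf/ sits one level above the spiders/ package as the traceback's paths suggest (CONF_PATH and load_conf are illustrative names, not the project's actual code):

```python
# Hypothetical fix sketch: anchor the config path to this module's location
# instead of the current working directory.
import json
from pathlib import Path

# Assumed layout, taken from the traceback:
#   collectSports/spiders/sports.py  ->  collectSports/conf/hg0088.json
CONF_PATH = Path(__file__).resolve().parent.parent / "conf" / "hg0088.json"

def load_conf(path=CONF_PATH):
    """Load the site config relative to this module, not the CWD."""
    with open(path, "r", encoding="utf8") as hg:
        return json.load(hg)
```

With the path anchored to `__file__`, `scrapy crawl` would find the file regardless of the directory it is started from.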
[Identical ERROR tracebacks, TLS certificate WARNINGs, and DEBUG "Crawled (200)" entries continue through 17:36:03; duplicates omitted.]
2019-06-21 17:36:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer:
None) 2019-06-21 17:36:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:03 [scrapy.core.scraper] ERROR: 
Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: 
Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse 
with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, 
in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 
339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 
29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring 
error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r 
for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result 
or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:07 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[... the same TLS WARNING, DEBUG "Crawled (200)" line, and the identical FileNotFoundError traceback repeat for every response from 17:36:08 through 17:36:10; verbatim duplicates elided ...]
2019-06-21 17:36:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return 
(r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96,
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') 
as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from 
host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:18 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 1 times): TCP connection timed out: 10060: The connection attempt failed because the connected party did not properly respond after a period of time, or the connected host failed to respond.
[2019-06-21 17:36:18 through 17:36:21: the same "Spider error processing" FileNotFoundError traceback recurs for every crawled page, interleaved with further "Crawled (200)" lines and "Invalid DNS-ID" TLS warnings identical to those above; the verbatim repeats are elided.]
No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.extensions.logstats] INFO: Crawled 614 pages (at 614 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:36:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback 
yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[2019-06-21 17:36:31 through 17:36:34: the identical FileNotFoundError traceback, TLS certificate warning, and "Crawled (200)" DEBUG line repeat for every crawled response; repeats omitted]
2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 17:36:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:40 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
[identical "Crawled (200)" DEBUG entries, "Invalid DNS-ID." TLS certificate warnings, and FileNotFoundError tracebacks repeat for every crawled response from 2019-06-21 17:36:40 through 17:36:43; duplicates elided]
2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or 
()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:45 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:36:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: 
[Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) 
for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 
37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as 
hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x 
in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in 
result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 
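Every spider error above is the same failure: line 96 of sports.py opens the config with a path relative to the process's current working directory, so the file is only found when scrapy is launched from one particular folder. A minimal sketch of a CWD-independent loader, assuming the layout the log suggests (sports.py under collectSports/spiders/ and the JSON under collectSports/conf/); the helper name load_hg0088_conf is hypothetical:

```python
# Sketch of a CWD-independent way to load the config parse() needs.
# Assumes this code lives in collectSports/spiders/sports.py and that
# hg0088.json sits in collectSports/conf/, as the log's relative path
# suggests; adjust CONF_PATH if the project layout differs.
import json
from pathlib import Path

# __file__ is .../collectSports/spiders/sports.py, so parents[1] is the
# collectSports package directory no matter where scrapy was launched from.
CONF_PATH = Path(__file__).resolve().parents[1] / "conf" / "hg0088.json"

def load_hg0088_conf():
    """Load the hg0088 site config, failing with the resolved path on error."""
    if not CONF_PATH.is_file():
        raise FileNotFoundError(f"config not found: {CONF_PATH}")
    with CONF_PATH.open("r", encoding="utf8") as hg:
        return json.load(hg)
```

parse() would then call load_hg0088_conf() instead of open('../collectSports/conf/hg0088.json', ...), and a missing file would report the absolute path it actually looked at.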
2019-06-21 17:36:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:36:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:36:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:57 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:57 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:36:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", 
line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for 
x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:37:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:37:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:37:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:05 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:07 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:37:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:37:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:37:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:37:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
[... the same DEBUG "Crawled (200)" lines, "Invalid DNS-ID" TLS warnings, and this identical FileNotFoundError traceback repeat for every crawled response from 17:37:07 through 17:37:10 ...]
2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line
29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result 
or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:37:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:37:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:37:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.'))
2019-06-21 17:37:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:37:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:37:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:37:18 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 1 times): TCP connection timed out: 10060: The connection attempt failed because the connected party did not properly respond after a period of time, or the connected host failed to respond.
2019-06-21 17:37:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:37:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:37:19 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 17:37:19 [scrapy.statscollectors] INFO:
Dumping Scrapy stats:
{'downloader/exception_count': 2,
 'downloader/exception_type_count/twisted.internet.error.TCPTimedOutError': 2,
 'downloader/request_bytes': 555149,
 'downloader/request_count': 1117,
 'downloader/request_method_count/POST': 1117,
 'downloader/response_bytes': 340075,
 'downloader/response_count': 1115,
 'downloader/response_status_count/200': 1115,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2019, 6, 21, 9, 37, 19, 382358),
 'log_count/DEBUG': 1117,
 'log_count/ERROR': 1115,
 'log_count/INFO': 10,
 'log_count/WARNING': 1115,
 'response_received_count': 1115,
 'retry/count': 2,
 'retry/reason_count/twisted.internet.error.TCPTimedOutError': 2,
 'scheduler/dequeued': 1117,
 'scheduler/dequeued/memory': 1117,
 'scheduler/enqueued': 1117,
 'scheduler/enqueued/memory': 1117,
 'spider_exceptions/FileNotFoundError': 1115,
 'start_time': datetime.datetime(2019, 6, 21, 9, 35, 26, 523593)}
2019-06-21 17:37:19 [scrapy.core.engine] INFO: Spider closed (finished)
2019-06-21 17:38:15 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 17:38:15 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 17:38:15 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 17:38:15 [scrapy.extensions.telnet] INFO: Telnet Password: ca0d86b512779649
2019-06-21 17:38:15 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 17:38:15 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 17:38:15 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 17:38:15 [scrapy.middleware] INFO: Enabled item pipelines: []
2019-06-21 17:38:15 [scrapy.core.engine] INFO: Spider opened
2019-06-21 17:38:15 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 17:38:15 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 17:38:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:38:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
'../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for 
r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) 
for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:38:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:38:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 
17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or 
()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', 
encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for 
x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: 
Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[... the same FileNotFoundError traceback, "Invalid DNS-ID" TLS warnings, and Crawled (200) entries repeat through 2019-06-21 17:38:30 ...]
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:32 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:32 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:32 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:38:32 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:38:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') 
as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from 
host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 
37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () 
if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () 
if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:37 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:37 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:38 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as 
hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
'../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or 
()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[2019-06-21 17:38:42 through 17:38:45: the same TLS certificate WARNING from host "205.201.4.177", "Crawled (200)" DEBUG lines, and FileNotFoundError traceback for '../collectSports/conf/hg0088.json' repeat verbatim for every response; repeated entries elided]
2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: 
Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
17:38:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", 
line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or 
()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:50 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:38:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:38:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return 
(r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or 
directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result 
or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:38:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
[2019-06-21 17:38:56 to 17:38:59: interleaved [scrapy.core.downloader.tls] WARNING entries ("Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))"), [scrapy.core.engine] DEBUG "Crawled (200) (referer: None)" entries, and the identical FileNotFoundError traceback from sports.py line 96, repeated for every response; verbatim duplicates omitted]
2019-06-21 17:38:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory:
'../collectSports/conf/hg0088.json'
2019-06-21 17:38:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:03 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 17:39:03 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 17:39:03 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 17:39:03 [scrapy.extensions.telnet] INFO: Telnet Password: 4efde09c196f9ecb
2019-06-21 17:39:03 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 17:39:03 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 17:39:03 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 17:39:03 [scrapy.middleware] INFO: Enabled item pipelines: []
2019-06-21 17:39:03 [scrapy.core.engine] INFO: Spider opened
2019-06-21 17:39:03 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 17:39:03 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
[2019-06-21 17:39:04: repeated [scrapy.core.downloader.tls] WARNING entries ("Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))") and [scrapy.core.engine] DEBUG "Crawled (200) (referer: None)" entries; verbatim duplicates omitted]
2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for 
r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x 
in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", 
line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in 
result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or 
directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:08 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the identical FileNotFoundError traceback repeats for every crawled response from 17:39:08 through 17:39:10, interleaved with further "Crawled (200)" DEBUG lines and the same "Invalid DNS-ID" TLS warnings for host "205.201.4.177" ...]
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No 
such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:10 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:10 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in 
result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or 
()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or 
directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', 
encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:22 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:22 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:22 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: 
[Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:25 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in 
return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory:
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
17:39:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: 
Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 
58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', 
encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:36 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:36 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:36 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:38 [scrapy.core.downloader.tls]
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', 
encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] 
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:39:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:44 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:45 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:39:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:45 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) 
(referer: None) 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] 
WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or 
() if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in 
result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  [... identical traceback ...]
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the identical ERROR traceback ending in FileNotFoundError, DEBUG "Crawled (200)" lines, and "Invalid DNS-ID." TLS warnings repeat for every response from 17:39:48 through 17:39:51 ...]
2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:51 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:39:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.'))
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, 
in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:57 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:39:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:39:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:39:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
    next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:39:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:39:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[The identical ERROR traceback, DEBUG "Crawled (200)" lines, and TLS "Invalid DNS-ID." WARNING above repeat continuously from 17:39:59 through 17:40:02; the verbatim repeats are omitted.]
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') 
as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:02 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:03 [scrapy.extensions.logstats] INFO: Crawled 674 pages (at 674 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 17:40:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:40:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:05 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:40:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:05 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:05 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:06 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:06 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:06 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:06 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 
37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:09 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:13 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider) 2019-06-21 17:40:13 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 
2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0 2019-06-21 17:40:13 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'} 2019-06-21 17:40:13 [scrapy.extensions.telnet] INFO: Telnet Password: 3bc7b175089e7dd5 2019-06-21 17:40:13 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats'] 2019-06-21 17:40:13 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats'] 2019-06-21 17:40:13 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 
'scrapy.spidermiddlewares.depth.DepthMiddleware'] 2019-06-21 17:40:13 [scrapy.middleware] INFO: Enabled item pipelines: [] 2019-06-21 17:40:13 [scrapy.core.engine] INFO: Spider opened 2019-06-21 17:40:13 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 17:40:13 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:14 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:15 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r',
encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:18 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) 
(referer: None) 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:23 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:23 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
    (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:27 [scrapy.core.engine]
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () 
if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:27 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:27 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r 
in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, 
in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from 
host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:31 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:31 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:31 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in 
return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:40:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:38 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:38 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:38 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:38 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 
[scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result 
or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () 
if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result 
or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:39 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 
339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 
29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent 
call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:40 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:40 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent 
call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:43 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:44 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:44 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:45 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:45 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call 
last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:47 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:40:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:40:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[... identical "Spider error processing" tracebacks, "Invalid DNS-ID" TLS warnings, and "Crawled (200)" DEBUG lines repeat for every response through 2019-06-21 17:40:51 ...]
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 
339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:51 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:51 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: 
ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:53 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:54 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r 
in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:55 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None)
[The identical Spider error traceback (FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json', raised at sports.py, line 96, in parse) repeats for every crawled response through 2019-06-21 17:40:57, interleaved with "Crawled (200)" DEBUG lines and the "Invalid DNS-ID." TLS certificate warnings.]
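Every traceback in this log has the same root cause: sports.py opens its config with a path relative to the process's current working directory, so the file is found only when `scrapy crawl` happens to be launched from one specific directory. A minimal sketch of a CWD-independent loader; the `load_config` helper and the `../conf/hg0088.json` layout relative to the spider module are assumptions, not part of the project:

```python
# Sketch: resolve the config path against the spider module's own location,
# not the current working directory. load_config() is a hypothetical helper.
import json
from pathlib import Path

def load_config(module_file: str, relative: str = "../conf/hg0088.json") -> dict:
    # Path(module_file).resolve().parent is the directory containing the module;
    # joining `relative` onto it gives the same file regardless of the CWD.
    path = (Path(module_file).resolve().parent / relative).resolve()
    with open(path, "r", encoding="utf8") as hg:
        return json.load(hg)
```

Inside the spider, `load_config(__file__)` would then locate the JSON no matter which directory the crawl is started from.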
[The same traceback continues at 2019-06-21 17:40:58.]
2019-06-21 17:40:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:40:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from
host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:58 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:58 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:58 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or 
()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:40:59 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 1 times): TCP connection timed out: 10060: 由于连接方在一段时间后没有正确答复或连接的主机没有反应,连接尝试失败。. 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid 
DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:00 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if 
_filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:01 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or 
() if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in 
result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host 
"205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:02 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] 
ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:03 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as 
hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) 
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback 
(most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or 
directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:04 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:05 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:05 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:07 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return 
(r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:08 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:08 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:09 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:09 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the same "Spider error processing" traceback, "Crawled (200)" DEBUG entries, and "Invalid DNS-ID" TLS warnings repeat verbatim through 2019-06-21 17:41:11 ...]
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:11 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:11 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:11 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:11 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 
'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in 
result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:12 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:12 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.extensions.logstats] INFO: Crawled 656 pages (at 656 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:13 
[... the same Spider error traceback, TLS certificate warnings for host "205.201.4.177", and Crawled (200) entries repeat for each response through 2019-06-21 17:41:15 ...]
or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:15 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, 
in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse 
with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 
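The repeated FileNotFoundError above comes from `sports.py` line 96 opening its config with a path relative to the process's working directory: `'../collectSports/conf/hg0088.json'` is resolved against wherever `scrapy crawl` was launched, not against the spider module. A minimal sketch of a cwd-independent loader follows; the names `MODULE_DIR` and `load_site_conf` are invented for illustration, and the directory layout is only inferred from the paths in the traceback, not taken from the project's actual code.

```python
import json
from pathlib import Path

# Anchor paths on this module's own location rather than the process cwd;
# fall back to the cwd when __file__ is unavailable (e.g. interactive use).
MODULE_DIR = Path(__file__).resolve().parent if "__file__" in globals() else Path.cwd()

def load_site_conf(conf_dir, name="hg0088.json"):
    """Read a JSON site-config file from an explicit directory."""
    with open(Path(conf_dir) / name, "r", encoding="utf8") as hg:
        return json.load(hg)

# In the spider's parse(), something along the lines of
#     conf = load_site_conf(MODULE_DIR.parent / "conf")
# would replace the fragile relative open() call, so the crawl works no
# matter which directory it is started from.
```

This does not touch the unrelated TLS warnings ("Invalid DNS-ID"), which only report that the certificate for 205.201.4.177 could not be verified and are ignored by Scrapy's downloader.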
2019-06-21 17:41:16 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:16 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:17 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying 
certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:17 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) 
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in 
result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:18 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] 
No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in 
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:19 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r 
in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.engine] 
DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, 
in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, 
in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file 
or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:20 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 
96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:21 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:21 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:21 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... identical "Crawled (200)" DEBUG entries, "Invalid DNS-ID." TLS warnings for host 205.201.4.177, and the same FileNotFoundError traceback repeat for every response from 17:41:21 through 17:41:24; verbatim duplicates elided ...]
2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno
2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:24 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:24 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error 
while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json'
2019-06-21 17:41:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:26 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
[... the identical "Spider error processing" traceback, "Crawled (200)" DEBUG line, and TLS certificate WARNING repeat for every subsequent response from 17:41:26 through 17:41:28 ...]
parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:28 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:28 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:28 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in 
result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing 
(referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 
17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:29 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:29 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback 
yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:30 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:30 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:30 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
[2019-06-21 17:41:30 through 17:41:33: the WARNING, DEBUG, and ERROR entries above repeat verbatim for each crawled response; duplicates elided]
File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:33 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield 
next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) 
Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 
[scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if 
_filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:34 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:35 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:35 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:35 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
[... identical FileNotFoundError tracebacks, "Invalid DNS-ID." TLS warnings, and Crawled (200) entries repeat for every response at 17:41:35; repeats elided ...]
2019-06-21 17:41:39 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
[... startup banner identical to the 16:05:06 run: same version info, overridden settings, enabled extensions, downloader and spider middlewares, and empty item pipelines; elided ...]
2019-06-21 17:41:39 [scrapy.extensions.telnet] INFO: Telnet Password: 1c7d05e43a613ce0
2019-06-21 17:41:39 [scrapy.core.engine] INFO: Spider opened
2019-06-21 17:41:39 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 17:41:39 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
[... the restarted run immediately hits the same repeating FileNotFoundError tracebacks, TLS warnings, and Crawled (200) entries at 17:41:39-17:41:41; repeats elided ...]
2019-06-21 17:41:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:41 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in 
result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:41 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:41 
[scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:41 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x 
in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error 
processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", 
line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in 
iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:42 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:43 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:43 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json'
[the same "Spider error processing" FileNotFoundError traceback, "Invalid DNS-ID." TLS warnings, and Crawled (200) DEBUG entries repeat for every response through 2019-06-21 17:41:45]
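Every traceback in this log fails at the same call: `open('../collectSports/conf/hg0088.json', ...)` at `sports.py` line 96. The path is relative to the process's working directory, so it only resolves when `scrapy crawl` happens to be launched from the right folder. A minimal sketch of one common fix is to resolve the config path against the module's own location instead of the cwd; the helper name `load_site_conf`, its `base` parameter, and the assumed `conf/` layout are illustrative, not taken from this project:

```python
import json
from pathlib import Path


def load_site_conf(name, base=None):
    """Load a JSON config file located relative to this module, not the cwd.

    `base` defaults to the directory containing this file, so the lookup
    succeeds no matter which directory the crawl was started from.
    """
    if base is None:
        base = Path(__file__).resolve().parent
    conf_path = Path(base) / "conf" / name  # e.g. .../conf/hg0088.json
    with conf_path.open("r", encoding="utf8") as fh:
        return json.load(fh)
```

In this project's layout the `conf` directory appears to sit next to the `spiders` package, so the real base would likely be one level above the spider module (`Path(__file__).resolve().parent.parent`); passing it explicitly keeps the helper testable.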
2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: 
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" 
(exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:46 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in 
process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:46 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled 
(200) (referer: None) 2019-06-21 17:41:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: 
None) 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:47 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
[... further identical tracebacks, "Crawled (200)" DEBUG entries, and TLS warnings between 17:41:47 and 17:41:48 elided ...]
2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for
r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate 
from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: 
Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r 
for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r 
in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:49 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:49 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:49 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
(The same FileNotFoundError traceback, "Invalid DNS-ID" TLS warning, and "Crawled (200)" entries repeat for every request through 17:41:50.)
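Every repeated spider error in this log has the same root cause: `parse()` in sports.py opens the config file with a path relative to the process working directory (`'../collectSports/conf/hg0088.json'`), so the crawl only finds it when launched from one specific directory. A minimal sketch of a fix, assuming the `conf/` directory sits beside `spiders/` as the traceback paths suggest (`load_config` and the exact layout are illustrative, not taken from the original project):

```python
# Sketch: resolve the config file relative to the spider module
# instead of the process working directory, so 'scrapy crawl' works
# no matter where it is invoked from.
import json
from pathlib import Path


def load_config(spider_file: str) -> dict:
    # spider_file would be __file__ inside sports.py, i.e.
    # .../collectSports/spiders/sports.py; conf/ is assumed to be
    # a sibling of spiders/ (two levels up, then down into conf/).
    conf_path = Path(spider_file).resolve().parent.parent / "conf" / "hg0088.json"
    with open(conf_path, "r", encoding="utf8") as hg:
        return json.load(hg)
```

Passing `__file__` from the spider module makes the lookup independent of the shell's current directory, which is what the repeated `FileNotFoundError` above indicates went wrong.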
2019-06-21 17:41:50 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:41:50 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
(The same FileNotFoundError traceback, TLS warning, and "Crawled (200)" entries continue to repeat through 17:41:52.)
2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File 
"D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback 
yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: 
None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such 
file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:52 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for 
r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in 
result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while 
verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most 
recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:53 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: 
'../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () 
if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return 
(_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 
17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in 
return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:54 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:54 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider 
error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in 
result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with 
open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in 
result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", 
line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:55 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File 
"C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:55 [scrapy.core.scraper] ERROR: Spider error processing (referer: None) Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback yield next(it) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output for x in result: File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in return (_set_referer(r) for r in result or ()) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in return (r for r in result or () if _filter(r)) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in return (r for r in result or () if _filter(r)) File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg: FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json' 2019-06-21 17:41:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 2019-06-21 17:41:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None) 
2019-06-21 17:41:56 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sports.py", line 96, in parse
    with open('../collectSports/conf/hg0088.json', 'r', encoding='utf8') as hg:
FileNotFoundError: [Errno 2] No such file or directory: '../collectSports/conf/hg0088.json'
2019-06-21 17:41:56 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "205.201.4.177" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 17:41:56 [scrapy.core.engine] DEBUG: Crawled (200) (referer: None)
2019-06-21 17:42:01 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 17:42:01 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 17:42:01 [scrapy.extensions.telnet] INFO: Telnet Password: f1e57bb7e23c5e2d
2019-06-21 17:42:01 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 17:42:01 [scrapy.core.engine] INFO: Spider opened
2019-06-21 17:42:01 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
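The FileNotFoundError tracebacks above come from opening '../collectSports/conf/hg0088.json' with a path relative to the process working directory, which only resolves correctly when the crawl is launched from one specific directory. A minimal sketch of one way to make the lookup independent of the working directory, anchoring it to the spider module's own location; the load_conf helper and the throwaway layout below are illustrative assumptions, not part of the project:

```python
import json
import tempfile
from pathlib import Path

def load_conf(module_file: Path, name: str) -> dict:
    # Resolve conf/<name> relative to the given module file instead of
    # the process working directory, so the open() succeeds no matter
    # where the crawl is launched from.
    conf_path = module_file.resolve().parent.parent / 'conf' / name
    with open(conf_path, 'r', encoding='utf8') as hg:
        return json.load(hg)

# Demo under a throwaway layout mirroring the project: <root>/spiders/sports.py
# and <root>/conf/hg0088.json (hypothetical stand-ins for the real files).
root = Path(tempfile.mkdtemp())
(root / 'conf').mkdir()
(root / 'conf' / 'hg0088.json').write_text('{"league": "today"}', encoding='utf8')
(root / 'spiders').mkdir()
spider_file = root / 'spiders' / 'sports.py'
spider_file.write_text('', encoding='utf8')
print(load_conf(spider_file, 'hg0088.json'))  # {'league': 'today'}
```

In a real spider, `module_file` would be `Path(__file__)` inside sports.py; the key point is that the config path is derived from the module location, not from os.getcwd().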
2019-06-21 17:42:01 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023 2019-06-21 17:42:08 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.')) 2019-06-21 18:01:48 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider) 2019-06-21 18:01:48 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0 2019-06-21 18:01:48 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'} 2019-06-21 18:01:48 [scrapy.extensions.telnet] INFO: Telnet Password: 5c3b7ca1ccbfc80e 2019-06-21 18:01:48 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats'] 2019-06-21 18:01:48 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats'] 2019-06-21 18:01:48 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware'] 2019-06-21 18:01:48 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline'] 2019-06-21 18:01:48 [scrapy.core.engine] INFO: Spider opened 2019-06-21 18:01:48 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 18:01:48 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023 2019-06-21 18:09:38 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min) 2019-06-21 18:09:38 [scrapy.utils.signal] ERROR: Error caught on signal handler: > Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\signal.py", line 30, in send_catch_log *arguments, **named) File "C:\ProgramData\Anaconda3\lib\site-packages\pydispatch\robustapply.py", line 55, in robustApply return receiver(*arguments, **named) File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 343, in request_scheduled redirected_urls = request.meta.get('redirect_urls', []) AttributeError: 'NoneType' object has no attribute 'meta' 2019-06-21 18:09:38 [twisted] CRITICAL: Unhandled Error Traceback (most recent call last): File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\commands\crawl.py", line 58, in run self.crawler_process.start() File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\crawler.py", line 293, in start 
reactor.run(installSignalHandlers=False)  # blocking call
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\base.py", line 1272, in run
    self.mainLoop()
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\base.py", line 1281, in mainLoop
    self.runUntilCurrent()
--- <exception caught here> ---
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\base.py", line 902, in runUntilCurrent
    call.func(*call.args, **call.kw)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\reactor.py", line 41, in __call__
    return self._func(*self._a, **self._kw)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\core\engine.py", line 135, in _next_request
    self.crawl(request, spider)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\core\engine.py", line 210, in crawl
    self.schedule(request, spider)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\core\engine.py", line 216, in schedule
    if not self.slot.scheduler.enqueue_request(request):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\core\scheduler.py", line 54, in enqueue_request
    if not request.dont_filter and self.df.request_seen(request):
builtins.AttributeError: 'NoneType' object has no attribute 'dont_filter'
2019-06-21 18:09:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 18:09:48 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:10:25 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sportslst.py", line 58, in parse
    re = cbk(response.body)
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 20, in today
    re = self.mixbodyv(data, 'today', '/html/head/script[2]/text()')
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 35, in mixbodyv
    text = ls[0]
IndexError: list index out of range
2019-06-21 18:10:25 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 18:10:25 [scrapy.core.engine] ERROR: Scraper close failure
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "D:\workspace\sports_collect\collectSports\pipelines\sportslst.py", line 167, in close_spider
    self.client.close()
AttributeError: 'SportslstPipeline' object has no attribute 'client'
2019-06-21 18:10:25 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 443, 'downloader/request_count': 1, 'downloader/request_method_count/GET': 1, 'downloader/response_bytes': 306, 'downloader/response_count': 1, 'downloader/response_status_count/200': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2019, 6, 21, 10, 10, 25, 584325), 'log_count/CRITICAL': 1, 'log_count/ERROR': 3, 'log_count/INFO': 11, 'log_count/WARNING': 1, 'response_received_count': 1, 'scheduler/dequeued': 1, 'scheduler/dequeued/memory': 1, 'scheduler/enqueued': 1, 'scheduler/enqueued/memory': 1, 'spider_exceptions/IndexError': 1, 'start_time': datetime.datetime(2019, 6, 21, 10, 1, 48, 315678)}
2019-06-21 18:10:25 [scrapy.core.engine] INFO: Spider closed (finished)
2019-06-21 18:10:47 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:10:47 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:10:47 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:10:47 [scrapy.extensions.telnet] INFO: Telnet Password: d76f2ac5ee5932c8
2019-06-21 18:10:47 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:10:47 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:10:47 [scrapy.middleware] INFO: Enabled spider
middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:10:47 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:10:47 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:10:47 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:10:47 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:10:48 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 18:11:16 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sportslst.py", line 58, in parse
    re = cbk(response.body)
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 20, in today
    re = self.mixbodyv(data, 'today', '/html/head/script[2]/text()')
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 35, in mixbodyv
    text = ls[0]
IndexError: list index out of range
2019-06-21 18:11:16 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 18:11:16 [scrapy.core.engine] ERROR: Scraper close failure
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "D:\workspace\sports_collect\collectSports\pipelines\sportslst.py", line 167, in close_spider
    self.client.close()
AttributeError: 'SportslstPipeline' object has no attribute 'client'
2019-06-21 18:11:16 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 443, 'downloader/request_count': 1, 'downloader/request_method_count/GET': 1, 'downloader/response_bytes': 306, 'downloader/response_count': 1, 'downloader/response_status_count/200': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2019, 6, 21, 10, 11, 16, 975094), 'log_count/ERROR': 2, 'log_count/INFO': 9, 'log_count/WARNING': 1, 'response_received_count': 1, 'scheduler/dequeued': 1, 'scheduler/dequeued/memory': 1, 'scheduler/enqueued': 1, 'scheduler/enqueued/memory': 1, 'spider_exceptions/IndexError': 1, 'start_time': datetime.datetime(2019, 6, 21, 10, 10, 47, 964955)}
2019-06-21 18:11:16 [scrapy.core.engine] INFO: Spider closed (finished)
2019-06-21 18:11:25 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:11:25 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:11:25 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL':
'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:11:25 [scrapy.extensions.telnet] INFO: Telnet Password: d206e1997ad4edf8
2019-06-21 18:11:25 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:11:25 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:11:25 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:11:26 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:11:26 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:11:26 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:11:26 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:11:26 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 18:11:26 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sportslst.py", line 58, in parse
    re = cbk(response.body)
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 20, in today
    re = self.mixbodyv(data, 'today', '/html/head/script[2]/text()')
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 35, in mixbodyv
    text = ls[0]
IndexError: list index out of range
2019-06-21 18:11:26 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 18:11:26 [scrapy.core.engine] ERROR: Scraper close failure
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "D:\workspace\sports_collect\collectSports\pipelines\sportslst.py", line 167, in close_spider
    self.client.close()
AttributeError: 'SportslstPipeline' object has no attribute 'client'
2019-06-21 18:11:26 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 443, 'downloader/request_count': 1, 'downloader/request_method_count/GET': 1, 'downloader/response_bytes': 306, 'downloader/response_count': 1, 'downloader/response_status_count/200': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2019, 6, 21, 10, 11, 26, 979346), 'log_count/ERROR': 2, 'log_count/INFO': 9, 'log_count/WARNING': 1, 'response_received_count': 1, 'scheduler/dequeued': 1, 'scheduler/dequeued/memory': 1, 'scheduler/enqueued': 1, 'scheduler/enqueued/memory': 1, 'spider_exceptions/IndexError': 1, 'start_time': datetime.datetime(2019, 6, 21, 10, 11, 26, 96913)}
2019-06-21 18:11:26 [scrapy.core.engine] INFO: Spider closed (finished)
2019-06-21 18:11:34 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:11:34 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:11:34 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:11:34 [scrapy.extensions.telnet] INFO: Telnet Password: 303b52690db05771
2019-06-21 18:11:34 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:11:34 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:11:34 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:11:34 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:11:34 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:11:34 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:11:34 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:11:34 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 18:14:37 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:14:37 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:14:37 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:14:37 [scrapy.extensions.telnet] INFO: Telnet Password: 81e5aac18c6f394a
2019-06-21 18:14:37 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:14:37 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:14:37 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:14:37 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:14:37 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:14:37 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:14:37 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:15:05 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:15:05 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:15:05 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:15:06 [scrapy.extensions.telnet] INFO: Telnet Password: 2c42663128a099bc
2019-06-21 18:15:06 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:15:06 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:15:06 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:15:06 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:15:06 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:15:06 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:15:06 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:15:16 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:15:16 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:15:16 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:15:16 [scrapy.extensions.telnet] INFO: Telnet Password: c2e6d32c54066e82
2019-06-21 18:15:16 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:15:16 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:15:16 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:15:16 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:15:16 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:15:16 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:15:16 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:15:56 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:15:56 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:15:56 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:15:56 [scrapy.extensions.telnet] INFO: Telnet Password: 20efc0bf85b6ac8e
2019-06-21 18:15:56 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.logstats.LogStats']
2019-06-21 18:15:56 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:15:56 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:15:57 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:15:57 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:15:57 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:15:57
[scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:16:15 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 18:18:20 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sportslst.py", line 58, in parse
    re = cbk(response.body)
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 20, in today
    re = self.mixbodyv(data, 'today', '/html/head/script[2]/text()')
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 35, in mixbodyv
    text = ls[0]
IndexError: list index out of range
2019-06-21 18:18:20 [scrapy.extensions.logstats] INFO: Crawled 1 pages (at 1 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:18:20 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 18:18:20 [scrapy.core.engine] ERROR: Scraper close failure
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "D:\workspace\sports_collect\collectSports\pipelines\sportslst.py", line 167, in close_spider
    self.client.close()
AttributeError: 'SportslstPipeline' object has no attribute 'client'
2019-06-21 18:18:20 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 442, 'downloader/request_count': 1, 'downloader/request_method_count/GET': 1, 'downloader/response_bytes': 306, 'downloader/response_count': 1, 'downloader/response_status_count/200': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2019, 6, 21, 10, 18, 20, 848215), 'log_count/ERROR': 2, 'log_count/INFO': 10, 'log_count/WARNING': 1, 'response_received_count': 1, 'scheduler/dequeued': 1, 'scheduler/dequeued/memory': 1, 'scheduler/enqueued': 1, 'scheduler/enqueued/memory': 1, 'spider_exceptions/IndexError': 1, 'start_time': datetime.datetime(2019, 6, 21, 10, 15, 57, 38956)}
2019-06-21 18:18:20 [scrapy.core.engine] INFO: Spider closed (finished)
2019-06-21 18:18:29 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: SportsSpider)
2019-06-21 18:18:29 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.7.3 (default, Mar 27 2019, 17:13:21) [MSC v.1915 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1b 26 Feb 2019), cryptography 2.6.1, Platform Windows-10-10.0.17763-SP0
2019-06-21 18:18:29 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'SportsSpider', 'LOG_FILE': 'SportsSpiderlog20190621.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'collectSports.spiders', 'SPIDER_MODULES': ['collectSports.spiders'], 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 UBrowser/6.2.4098.3 Safari/537.36'}
2019-06-21 18:18:29 [scrapy.extensions.telnet] INFO: Telnet Password: 0c0b40253fe4adf1
2019-06-21 18:18:29 [scrapy.middleware] INFO: Enabled extensions: ['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.logstats.LogStats']
2019-06-21 18:18:29 [scrapy.middleware] INFO: Enabled downloader middlewares: ['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-06-21 18:18:29 [scrapy.middleware] INFO: Enabled spider middlewares: ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-06-21 18:18:29 [scrapy.middleware] INFO: Enabled item pipelines: ['collectSports.pipelines.sportslst.SportslstPipeline']
2019-06-21 18:18:29 [scrapy.core.engine] INFO: Spider opened
2019-06-21 18:18:29 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:18:29 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-06-21 18:18:33 [scrapy.core.downloader.tls] WARNING: Ignoring error while verifying certificate from host "199.26.100.178" (exception: ValueError('Invalid DNS-ID.'))
2019-06-21 18:39:02 [scrapy.core.scraper] ERROR: Spider error processing (referer: None)
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\utils\defer.py", line 102, in iter_errback
    yield next(it)
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\offsite.py", line 29, in process_spider_output
    for x in result:
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\referer.py", line 339, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\urllength.py", line 37, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "C:\ProgramData\Anaconda3\lib\site-packages\scrapy\spidermiddlewares\depth.py", line 58, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "D:\workspace\sports_collect\collectSports\spiders\sportslst.py", line 58, in parse
    re = cbk(response.body)
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 20, in today
    re = self.mixbodyv(data, 'today', '/html/head/script[2]/text()')
  File "D:\workspace\sports_collect\collectSports\mcollect\hg0088\Resolver.py", line 35, in mixbodyv
    text = ls[0]
IndexError: list index out of range
2019-06-21 18:39:02 [scrapy.extensions.logstats] INFO: Crawled 1 pages (at 1 pages/min), scraped 0 items (at 0 items/min)
2019-06-21 18:39:02 [scrapy.core.engine] INFO: Closing spider (finished)
2019-06-21 18:39:02 [scrapy.core.engine] ERROR: Scraper close failure
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\twisted\internet\defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "D:\workspace\sports_collect\collectSports\pipelines\sportslst.py", line 167, in close_spider
    self.client.close()
AttributeError: 'SportslstPipeline' object has no attribute 'client'
2019-06-21 18:39:03 [scrapy.statscollectors] INFO: Dumping Scrapy stats: {'downloader/request_bytes': 442, 'downloader/request_count': 1, 'downloader/request_method_count/GET': 1, 'downloader/response_bytes': 306, 'downloader/response_count': 1, 'downloader/response_status_count/200': 1, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2019, 6, 21, 10, 39, 3, 7590), 'log_count/ERROR': 2, 'log_count/INFO': 10, 'log_count/WARNING': 1, 'response_received_count': 1, 'scheduler/dequeued': 1, 'scheduler/dequeued/memory': 1, 'scheduler/enqueued': 1, 'scheduler/enqueued/memory': 1, 'spider_exceptions/IndexError': 1, 'start_time': datetime.datetime(2019, 6, 21, 10, 18, 29, 905287)}
2019-06-21 18:39:03 [scrapy.core.engine] INFO: Spider closed (finished)
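Two failures repeat in every run logged above: `Resolver.mixbodyv` raises IndexError because its XPath query (`/html/head/script[2]/text()`) matches nothing on the page actually served, and `SportslstPipeline.close_spider` raises AttributeError because `self.client` was never assigned before the spider shut down. A minimal sketch of the two guards follows; the signatures are simplified and hypothetical (the real `mixbodyv` takes `(data, kind, xpath)`), and only the failing steps from the tracebacks are reproduced:

```python
def first_match(ls):
    # Guard for the recurring IndexError in Resolver.mixbodyv: an XPath
    # query can legitimately match nothing (layout change, block page,
    # redirect stub), so return None instead of indexing an empty list.
    return ls[0] if ls else None


class SportslstPipeline:
    # Hypothetical reconstruction of the pipeline named in the tracebacks.
    def open_spider(self, spider):
        # In the real pipeline a DB client would be created here; if this
        # step fails or never runs, close_spider is still called by Scrapy.
        self.client = None

    def close_spider(self, spider):
        # Guard for the recurring AttributeError: do not assume the
        # attribute exists or that a connection was ever established.
        client = getattr(self, "client", None)
        if client is not None:
            client.close()
```

With guards like these, a crawl that receives an unexpected page shuts down cleanly instead of stacking a pipeline traceback on top of the parse failure, leaving only one actionable error per run in the log.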