
How to quit the Selenium driver when the spider closes

Stack Overflow user
Asked on 2015-08-12 20:10:47
2 answers · 874 views · 0 followers · Score 1

I have a spider that has to use Selenium to scrape dynamic data from pages. It looks like this:

import scrapy
from scrapy import signals
from scrapy.xlib.pydispatch import dispatcher
from selenium import webdriver


class MySpider(scrapy.Spider):
    name = 'myspider'
    start_urls = ['http://example.org']

    def __init__(self, *args, **kwargs):
        super(MySpider, self).__init__(*args, **kwargs)
        self.driver = webdriver.Firefox()
        self.driver.implicitly_wait(5)
        dispatcher.connect(self.spider_closed, signals.spider_closed)

    def spider_closed(self, spider):
        if self.driver:
            self.driver.quit()
            self.driver = None

The problem is that when I cancel the job in Scrapyd, it does not stop until I close the browser window manually. Once the spider is deployed to a real server, I obviously won't be able to do that.

This is what I see in the Scrapyd log every time I hit "Cancel":

2015-08-12 13:48:13+0300 [HTTPChannel,208,127.0.0.1] Unhandled Error
    Traceback (most recent call last):
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/twisted/web/http.py", line 1731, in allContentReceived
        req.requestReceived(command, path, version)
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/twisted/web/http.py", line 827, in requestReceived
        self.process()
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/twisted/web/server.py", line 189, in process
        self.render(resrc)
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/twisted/web/server.py", line 238, in render
        body = resrc.render(self)
    --- <exception caught here> ---
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/scrapyd/webservice.py", line 18, in render
        return JsonResource.render(self, txrequest)
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/scrapy/utils/txweb.py", line 10, in render
        r = resource.Resource.render(self, txrequest)
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/twisted/web/resource.py", line 250, in render
        return m(request)
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/scrapyd/webservice.py", line 55, in render_POST
        s.transport.signalProcess(signal)
      File "/home/dmitry/.virtualenvs/myproject/local/lib/python2.7/site-packages/twisted/internet/process.py", line 339, in signalProcess
        raise ProcessExitedAlready()
    twisted.internet.error.ProcessExitedAlready: 

But the job stays in the job list, marked as "Running". So how do I close the driver?
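As an aside (not from the original question or answers): the traceback shows Scrapyd cancelling a job via `signalProcess`, i.e. by sending the spider process an OS signal. A defensive pattern is to register OS-level cleanup hooks so the browser is quit even when Scrapy's `spider_closed` signal never fires. A minimal stdlib-only sketch, with the illustrative names `DriverGuard` and `FakeDriver` standing in for the real `webdriver.Firefox()`:

```python
import atexit
import signal
import sys


class DriverGuard:
    """Quit a webdriver-like object exactly once, however the process ends."""

    def __init__(self, driver):
        self.driver = driver
        atexit.register(self.quit)                     # normal interpreter exit
        signal.signal(signal.SIGTERM, self._on_term)   # signal sent by a cancel

    def _on_term(self, signum, frame):
        self.quit()
        sys.exit(0)   # let the process terminate cleanly after cleanup

    def quit(self):
        # Idempotent: a second call is a no-op.
        if self.driver is not None:
            self.driver.quit()
            self.driver = None


class FakeDriver:
    """Stand-in for webdriver.Firefox() so the sketch runs anywhere."""

    def __init__(self):
        self.quit_called = 0

    def quit(self):
        self.quit_called += 1


driver = FakeDriver()
guard = DriverGuard(driver)
guard.quit()
guard.quit()   # second call does nothing
print(driver.quit_called)  # 1
```

In the real spider, `FakeDriver` would be the actual Selenium driver; the guard complements (rather than replaces) the `spider_closed` handler.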


2 Answers

Stack Overflow user

Answered on 2015-08-14 00:36:31

Import SignalManager:

from scrapy.signalmanager import SignalManager

Then replace:

dispatcher.connect(self.spider_closed, signals.spider_closed)

with:

SignalManager(dispatcher.Any).connect(self.spider_closed, signal=signals.spider_closed)

Score 0

Stack Overflow user

Answered on 2015-08-15 01:54:47

Have you tried implementing `from_crawler` on the spider? I have only done it for pipelines and extensions, but it should work the same way for spiders.

@classmethod
def from_crawler(cls, crawler, *args, **kwargs):
    o = cls(*args, **kwargs)
    crawler.signals.connect(o.spider_closed, signal=signals.spider_closed)
    return o

http://doc.scrapy.org/en/latest/topics/spiders.html#scrapy.spiders.Spider.from_crawler
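To illustrate the wiring this answer describes without a running crawler, here is a stdlib-only sketch of the `from_crawler` pattern. `FakeCrawler`, `FakeSignals`, and the `SPIDER_CLOSED` constant are stand-ins for Scrapy's real `crawler`, `crawler.signals`, and `scrapy.signals.spider_closed`, used purely for demonstration:

```python
class FakeSignals:
    """Tiny stand-in for Scrapy's signal manager."""

    def __init__(self):
        self.handlers = {}

    def connect(self, receiver, signal):
        self.handlers.setdefault(signal, []).append(receiver)

    def send(self, signal, **kwargs):
        for receiver in self.handlers.get(signal, []):
            receiver(**kwargs)


class FakeCrawler:
    def __init__(self):
        self.signals = FakeSignals()


SPIDER_CLOSED = "spider_closed"   # stands in for scrapy.signals.spider_closed


class MySpider:
    def __init__(self):
        self.driver_open = True   # stands in for holding a real webdriver

    @classmethod
    def from_crawler(cls, crawler):
        # Same shape as the answer's classmethod: build the instance,
        # then connect its handler to the crawler's signal manager.
        o = cls()
        crawler.signals.connect(o.spider_closed, signal=SPIDER_CLOSED)
        return o

    def spider_closed(self, spider):
        self.driver_open = False  # real code would call self.driver.quit()


crawler = FakeCrawler()
spider = MySpider.from_crawler(crawler)
crawler.signals.send(SPIDER_CLOSED, spider=spider)
print(spider.driver_open)  # False
```

The point of the pattern is that the crawler (not the spider's `__init__`) owns the signal manager, so the `spider_closed` hook is registered against the process that actually shuts the spider down.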

Score 0
Original content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/31964840
