Building a crawler management platform

This is the 16th day of my participation in the August More Text Challenge. For details, see: August More Text Challenge

Learning Objectives:

  1. Understand crawler management platforms

    • scrapydweb
    • gerapy
    • crawlab
  2. Build a crawler management platform locally

Understanding crawler management platforms:

  1. Scrapydweb:

    A web application for Scrapyd cluster management, with support for Scrapy log analysis and visualization. GitHub address: https://github.com/my8100/scrapydweb.git
  2. Gerapy:

    A distributed crawler management framework based on Scrapy, Scrapyd, scrapyd-client, scrapyd-API, Django and Vue. Related blog post: https://www.cnblogs.com/xbhog/p/13336651.html GitHub address: https://github.com/Gerapy/Gerapy.git
  3. Crawlab:

    A Golang-based distributed crawler management platform that supports multiple programming languages and crawler frameworks. Documentation: https://docs.crawlab.cn/zh/ GitHub address: https://github.com/crawlab-team/crawlab.git

Note: the first two frameworks are built on top of Scrapyd. If you are not sure how to configure it, see my earlier blog post: www.cnblogs.com/xbhog/p/133…
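
For quick reference, here is a minimal sketch of getting Scrapyd itself running; the package names are the standard PyPI ones, and 6800 is Scrapyd's documented default port:

    # install Scrapyd and the client-side deployment tool
    pip install scrapyd scrapyd-client
    # start the Scrapyd service; by default it listens on http://127.0.0.1:6800
    scrapyd

Projects are then deployed to the running service with scrapyd-deploy, as described in the linked post.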

Building a crawler management platform locally:

  1. Scrapydweb setup (the full command sequence is collected in the sketch after this item):

    • Install: pip install scrapydweb -i pypi.doubanio.com/simple

    • First, start scrapyd from the command line

    • Then run scrapydweb

    • Interface preview:

    • There are many online tutorials for deployment and installation
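
    Putting the steps above together, a minimal sketch of the ScrapydWeb setup; scrapyd and scrapydweb run in two separate terminals, the Douban mirror index is optional (the https:// scheme is added here), and port 5000 is assumed to be ScrapydWeb's default:

      # install ScrapydWeb (the -i mirror option is optional)
      pip install scrapydweb -i https://pypi.doubanio.com/simple
      # terminal 1: start the Scrapyd service (default port 6800)
      scrapyd
      # terminal 2: start ScrapydWeb, then open the web UI (typically http://127.0.0.1:5000)
      scrapydweb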

  2. Gerapy:

    • For the related configuration, see my earlier blog post: www.cnblogs.com/xbhog/p/133… (a minimal quick-start sketch follows this item)
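
    For reference, a minimal quick-start sketch based on the commands documented in the Gerapy README; the gerapy workspace directory is the one created by gerapy init, and the admin-account command differs slightly between versions:

      # install Gerapy
      pip install gerapy
      # create a workspace and initialize its database
      gerapy init
      cd gerapy
      gerapy migrate
      # create the admin account (on some versions: gerapy createsuperuser)
      gerapy initadmin
      # start the web UI (default port 8000)
      gerapy runserver
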
  3. Crawlab (the official documentation covers configuration and installation in detail; here is a brief walkthrough, with the full command sequence collected in the sketch after this item):

    • First, clone the code from the remote repository: run git clone with the repository address, or import the address into PyCharm

    • # Personally I chose Docker: the project needs a lot of environment configuration, and I was worried about conflicts with my local setup
    • Docker installation:

      • Installer download address: www.docker.com/products/do…
      • Installation environment: Enable local virtualization and Hyper-V, as shown in the following figure

    • Install with the default options.

    • For more detail, see the related links (beginner tutorial: www.runoob.com/docker/wind.)

    • Install docker-compose: pip install docker-compose

    • Test in the project root directory with docker-compose ps; if Docker is working correctly, the list of containers is empty:

      Name   Command   State   Ports
      ------------------------------
    • Install and start: docker-compose up -d

    • Then open http://127.0.0.1:8080/#/login in the browser
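
    Putting the Crawlab steps together, a minimal sketch of the Docker-based setup described above; it assumes the docker-compose.yml that ships in the repository root, which is what running docker-compose there implies:

      # clone the repository and enter the project root
      git clone https://github.com/crawlab-team/crawlab.git
      cd crawlab
      # install docker-compose (Docker Desktop itself is installed separately, see above)
      pip install docker-compose
      # sanity check: with nothing running yet, the table is empty
      docker-compose ps
      # pull the images and start the services in the background
      docker-compose up -d
      # then open http://127.0.0.1:8080/#/login in the browser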

The end:

If you have read this far, or if this post has helped you, I hope you will follow or recommend it. Thank you!

If there are any errors, please point them out in the comments, and I will correct them as soon as I see them.