Preface
This update includes the following major changes:
- Centralized Log Management
- Automatic dependency installation
- Open API
- Web Hook
- Automatically generate result sets
Update log
Features/optimizations
- Optimized log management. Logs are now centrally managed and stored in MongoDB, reducing the dependence on PubSub and allowing abnormal logs to be detected.
- Automatic dependency installation. Dependencies can now be installed automatically from requirements.txt and package.json.
- API Tokens. Allows users to generate API tokens and use them to integrate Crawlab into their own systems (see the sketch after this list).
- Web Hook. Triggers a Web Hook HTTP request to a predefined URL when a task starts or ends (a receiver sketch follows this list).
- Automatically generate result sets. If the result collection is not set, it defaults to results_<spider_name> (see the sketch after this list).
- Optimized the Project list. "No Project" is no longer displayed in the Project list.
- Upgraded Node.js. Node.js has been upgraded from v8.12 to v10.19.
- Added a run button for scheduled tasks. Allows users to manually run crawler tasks on the scheduled task interface.
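
To illustrate the API Tokens feature, below is a minimal sketch of calling the Open API from an external system with a generated token. The base URL, the /api/spiders path and the Authorization header format are assumptions for illustration only; consult the Open API documentation for the actual endpoints and response shape.

```python
# Sketch of calling the Crawlab Open API with a generated token.
# The /api/spiders path, the Authorization header format and the "data" field
# in the response are assumptions for illustration, not the documented contract.
import requests

CRAWLAB_URL = "http://localhost:8080"   # adjust to your deployment
API_TOKEN = "<your-generated-token>"    # created on the API Token page in the UI

resp = requests.get(
    f"{CRAWLAB_URL}/api/spiders",
    headers={"Authorization": API_TOKEN},
    timeout=10,
)
resp.raise_for_status()
for spider in resp.json().get("data", []):
    print(spider.get("name"))
```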
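The Web Hook fires an HTTP request against the URL you configure when a task starts or ends. As a rough sketch of consuming such a callback, the receiver below logs incoming events; the payload fields (task_id, status) are illustrative assumptions, not the documented schema.

```python
# Minimal Web Hook receiver sketch. The JSON fields read below (task_id, status)
# are illustrative assumptions, not Crawlab's documented payload schema.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            payload = json.loads(body or b"{}")
        except json.JSONDecodeError:
            payload = {}
        # Log the event; a real handler might notify a chat channel or update a dashboard.
        print("task event:", payload.get("task_id"), payload.get("status"))
        self.send_response(200)
        self.end_headers()


if __name__ == "__main__":
    # Point the Web Hook URL in Crawlab at http://<this-host>:8000/
    HTTPServer(("0.0.0.0", 8000), WebhookHandler).serve_forever()
```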
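For the result-set change, results are written to a MongoDB collection, and an unset collection name now defaults to results_<spider_name>. The sketch below shows a spider script saving items through the Crawlab Python SDK; whether your installed SDK version exposes save_item exactly like this is an assumption to verify against the SDK docs.

```python
# Sketch of a spider script saving results through the Crawlab Python SDK.
# `save_item` from the crawlab package is assumed to be available in your SDK
# version; items land in the spider's result collection, which now defaults to
# results_<spider_name> when no collection is configured.
import requests
from crawlab import save_item

resp = requests.get("https://example.com", timeout=10)
save_item({
    "url": resp.url,
    "status": resp.status_code,
    "length": len(resp.text),
})
```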
Bug fixes
- Unable to register. #670
- Cron expressions on the spider's scheduled-task tab display seconds. #678
- Daily crawler data missing. #684
- Result count not updated immediately. #689
Product planning
- Results display
  - Support for other databases
- Crawlers
  - Support for containerized crawlers
  - Support for long-running task crawlers
  - Configurable crawlers
    - Configurable crawlers support Splash
    - Configurable crawlers support CrawlSpider
    - Configurable crawlers support regular-expression fields
- Scheduled tasks
  - Calendar view
- Server
  - Support for operating Docker images from the terminal
- SDK
  - More command support
- Global
  - Hot updates
References
- GitHub: github.com/crawlab-tea…
- Demo: crawlab.cn/demo
- Documentation: docs.crawlab.cn
Community
If you find Crawlab helpful for your daily development or your company, please add the author on WeChat (tikazyQ1) with the note "Crawlab", and the author will invite you to the discussion group. You are welcome to star the project on GitHub, and feel free to open an issue there if you run into any problems. Contributions to Crawlab's development are also very welcome.