A Powerful Web Crawler System: pyspider

This article introduces pyspider, a powerful web crawler system: first building a single-server environment, then extending it to a distributed deployment across three servers.

PySpider is a powerful web crawler system written by a Chinese developer, with a strong WebUI. It is written in Python, uses a distributed architecture, supports multiple database backends, and its WebUI provides a script editor, task monitor, project manager, and result viewer.
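To make that concrete, below is a minimal crawl script of the kind you edit in the WebUI script editor. It is a sketch following pyspider's standard script template; the seed URL example.com and the returned fields are placeholders, not part of the original article.

from pyspider.libs.base_handler import *


class Handler(BaseHandler):
    crawl_config = {}

    @every(minutes=24 * 60)
    def on_start(self):
        # Seed request; the scheduler re-runs this once a day.
        self.crawl('http://example.com/', callback=self.index_page)

    @config(age=10 * 24 * 60 * 60)
    def index_page(self, response):
        # Queue every outgoing link found on the page.
        for each in response.doc('a[href^="http"]').items():
            self.crawl(each.attr.href, callback=self.detail_page)

    def detail_page(self, response):
        # The returned dict is what shows up in the result viewer / resultdb.
        return {
            "url": response.url,
            "title": response.doc('title').text(),
        }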

1. Setting up the environment

OS version: Linux centos-linux.shared 3.10.0-123.el7.x86_64 #1 SMP Mon Jun 30 12:09:22 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux

Python version: Python 3.5.1

1.1. Setting up Python 3

After trying both approaches below, I chose the Anaconda distribution.

1.1.1. Building Python from source

# Install build dependencies
yum install -y ncurses-devel openssl openssl-devel zlib-devel gcc make glibc-devel libffi-devel glibc-static glibc-utils sqlite-devel readline-devel tk-devel gdbm-devel db4-devel libpcap-devel xz-devel
# Download the Python source
wget https://www.python.org/ftp/python/3.5.1/Python-3.5.1.tgz
# Or use a domestic mirror
wget http://mirrors.sohu.com/python/3.5.1/Python-3.5.1.tgz
mv Python-3.5.1.tgz /usr/local/src; cd /usr/local/src
# Unpack
tar -zxf Python-3.5.1.tgz; cd Python-3.5.1
# Configure, build, and install
./configure --prefix=/usr/local/python3.5 --enable-shared
make && make install
# Create a symlink and register the shared library
ln -s /usr/local/python3.5/bin/python3 /usr/bin/python3
echo "/usr/local/python3.5/lib" > /etc/ld.so.conf.d/python3.5.conf
ldconfig
# Verify
python3
# Python 3.5.1 (default, Oct 9 2016, 11:44:24)
# [GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux
# Type "help", "copyright", "credits" or "license" for more information.
# >>>
# pip
/usr/local/python3.5/bin/pip3 install --upgrade pip
ln -s /usr/local/python3.5/bin/pip /usr/bin/pip
# pip broke for me at this point, so I reinstalled it
wget https://bootstrap.pypa.io/get-pip.py --no-check-certificate
python get-pip.py

1.1.2. The Anaconda distribution

# Anaconda distribution (recommended)
wget https://repo.continuum.io/archive/Anaconda3-4.2.0-Linux-x86_64.sh
# Run the installer
./Anaconda3-4.2.0-Linux-x86_64.sh
# If it fails, the archive may not unpack without bzip2
yum install bzip2

1.2. Installing MariaDB

# Install
yum -y install mariadb mariadb-server
# Start the service
systemctl start mariadb
# Enable it at boot
systemctl enable mariadb
# Set the root password (empty by default)
mysql_secure_installation
# Log in
mysql -u root -p
# Create a user; pick your own account name and password
CREATE USER 'user_name'@'localhost' IDENTIFIED BY 'user_pass';
GRANT ALL PRIVILEGES ON *.* TO 'user_name'@'localhost' WITH GRANT OPTION;
CREATE USER 'user_name'@'%' IDENTIFIED BY 'user_pass';
GRANT ALL PRIVILEGES ON *.* TO 'user_name'@'%' WITH GRANT OPTION;

1.3. Installing pyspider

I used Anaconda.

# Create a virtual environment named sbird with Python 3.*
conda create -n sbird python=3*
# Activate it
source activate sbird
# Install pyspider
pip install pyspider
# Error:
#   it does not exist. The exported locale is "en_US.UTF-8" but it is not supported
# Fix (can also go into .bashrc):
export LC_ALL=en_US.utf-8
export LANG=en_US.utf-8
# ImportError: pycurl: libcurl link-time version (7.29.0) is older than compile-time version (7.49.0)
conda install pycurl
# Deactivate
source deactivate sbird
# If localhost:5000 is unreachable from inside a VM, stop the firewall
systemctl stop firewalld.service

######### Run directly from source ##########
mkdir git; cd git
# Clone
git clone https://github.com/binux/pyspider.git
# Run
/root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py

Another approach

# Set up a virtual environment
pip install virtualenv
mkdir python; cd python
# Create a virtual environment named pyenv3
virtualenv -p /usr/bin/python3 pyenv3
# Enter and activate the environment
cd pyenv3/
source ./bin/activate
pip install pyspider
# If pycurl errors out:
yum install libcurl-devel
# then try again
pip install pyspider
# Deactivate
deactivate

I recommend installing via Anaconda.

If pyspider throws errors while running, refer back to the Anaconda installation notes above. At this point, visiting localhost:5000 should show the web page.

1.4. Installing Supervisor

# Install
yum install supervisor -y
# If the package cannot be found, add Aliyun's EPEL repository
vim /etc/yum.repos.d/epel.repo
# with the following content:
[epel]
name=Extra Packages for Enterprise Linux 7 - $basearch
baseurl=http://mirrors.aliyun.com/epel/7/$basearch
        http://mirrors.aliyuncs.com/epel/7/$basearch
failovermethod=priority
enabled=1
gpgcheck=0
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7

[epel-debuginfo]
name=Extra Packages for Enterprise Linux 7 - $basearch - Debug
baseurl=http://mirrors.aliyun.com/epel/7/$basearch/debug
        http://mirrors.aliyuncs.com/epel/7/$basearch/debug
failovermethod=priority
enabled=0
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
gpgcheck=0

[epel-source]
name=Extra Packages for Enterprise Linux 7 - $basearch - Source
baseurl=http://mirrors.aliyun.com/epel/7/SRPMS
        http://mirrors.aliyuncs.com/epel/7/SRPMS
failovermethod=priority
enabled=0
gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
gpgcheck=0

# Install again
yum install supervisor -y
# Verify the installation
echo_supervisord_conf

1.4.1. Using Supervisor

supervisord    # start the Supervisor server process
supervisorctl  # start the Supervisor command-line client
# Suppose we create a program named pyspider01
vim /etc/supervisord.d/pyspider01.ini
# with the following content:
[program:pyspider01]
command = /root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/pyspider01.log

# Reload
supervisorctl reload
# Start
supervisorctl start pyspider01
# It can also be started like this
supervisord -c /etc/supervisord.conf
# Check status
supervisorctl status
# output: pyspider01 RUNNING pid 4026, uptime 0:02:40
# Shut down
supervisorctl shutdown

1.5. Installing Redis

# Redis is used as the message queue
mkdir download; cd download
wget http://download.redis.io/releases/redis-3.2.4.tar.gz
tar xzf redis-3.2.4.tar.gz
cd redis-3.2.4
make
# Or install directly with yum
yum -y install redis
# Start
systemctl start redis.service
# Restart
systemctl restart redis.service
# Stop
systemctl stop redis.service
# Check status
systemctl status redis.service
# Edit /etc/redis.conf
vim /etc/redis.conf
# Changes:
#   daemonize no   ->  daemonize yes
#   bind 127.0.0.1 ->  bind 10.211.55.22 (this server's IP)
# Restart Redis
systemctl restart redis.service

1.6. Starting services at boot

# Enable Supervisor at boot
systemctl enable supervisord.service
# Enable Redis at boot
systemctl enable redis.service
# Disable the firewall at boot
systemctl disable firewalld.service

At this point the single-server pyspider environment is built and deployed; open localhost:5000 to reach the web UI.

You can also write and run crawl scripts, and check the running status in /pyspider/supervisor/pyspider01.log.

2. Distributed deployment

Name the server configured above centos01, then deploy two more servers, centos02 and centos03, with the same base setup.

The layout is as follows:

Server     IP             Components
centos01   10.211.55.22   redis, mariaDB, scheduler
centos02   10.211.55.23   fetcher, processor, result_worker, phantomjs
centos03   10.211.55.24   fetcher, processor, result_worker, webui

2.1. centos01

Log in to centos01. The base environment is already in place from step 1; first edit the configuration file /pyspider/config.json:

{
  "taskdb": "mysql+taskdb://user_name:user_pass@10.211.55.22:3306/taskdb",
  "projectdb": "mysql+projectdb://user_name:user_pass@10.211.55.22:3306/projectdb",
  "resultdb": "mysql+resultdb://user_name:user_pass@10.211.55.22:3306/resultdb",
  "message_queue": "redis://10.211.55.22:6379/db",
  "logging-config": "/pyspider/logging.conf",
  "phantomjs-proxy": "10.211.55.23:25555",
  "webui": {
    "username": "",
    "password": "",
    "need-auth": false,
    "host": "10.211.55.24",
    "port": "5000",
    "scheduler-rpc": "http://10.211.55.22:5002",
    "fetcher-rpc": "http://10.211.55.23:5001"
  },
  "fetcher": {
    "xmlrpc": true,
    "xmlrpc-host": "0.0.0.0",
    "xmlrpc-port": "5001"
  },
  "scheduler": {
    "xmlrpc": true,
    "xmlrpc-host": "0.0.0.0",
    "xmlrpc-port": "5002"
  }
}
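Because config.json is hand-edited, a stray comma or quote is a common reason the components below fail to start. A quick sanity check, as a minimal sketch assuming the /pyspider/config.json path used above:

import json

# Fail loudly if the hand-edited pyspider config is not valid JSON.
with open("/pyspider/config.json") as f:
    config = json.load(f)

# Show the top-level sections each component will read.
print(sorted(config))
print("scheduler-rpc:", config["webui"]["scheduler-rpc"])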

Try running it:

/root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json scheduler
# Error: ImportError: No module named 'mysql'
# Get mysql-connector-python
cd ~/git/
git clone https://github.com/mysql/mysql-connector-python.git
# Install it
source activate sbird
cd mysql-connector-python
python setup.py install
# Install the redis client
pip install redis
source deactivate
# Run again
/root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json scheduler
# Output: ok
[I 161010 15:57:25 scheduler:644] scheduler starting...
[I 161010 15:57:25 scheduler:779] scheduler.xmlrpc listening on 0.0.0.0:5002
[I 161010 15:57:25 scheduler:583] in 5m: new:0,success:0,retry:0,failed:0
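If the scheduler still refuses to start, it is usually because it cannot reach MariaDB or Redis on 10.211.55.22. The following connectivity check is a sketch, assuming the mysql-connector-python and redis packages installed above and the user_name/user_pass account created in section 1.2:

import mysql.connector
import redis

# Check the MariaDB account used in the taskdb/projectdb/resultdb URLs.
conn = mysql.connector.connect(host="10.211.55.22", port=3306,
                               user="user_name", password="user_pass")
print("MariaDB reachable:", conn.is_connected())
conn.close()

# Check the Redis instance used as the message queue.
r = redis.StrictRedis(host="10.211.55.22", port=6379)
print("Redis reachable:", r.ping())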

Once it runs successfully, update /etc/supervisord.d/pyspider01.ini as follows:

[program:pyspider01]
command = /root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json scheduler
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/pyspider01.log

# Reload
supervisorctl reload
# Check status
supervisorctl status

centos01 is now deployed.

2.2. centos02

On centos02 we need to run result_worker, processor, phantomjs, and fetcher.

Create the following files:

/etc/supervisord.d/result_worker.ini

[program:result_worker]
command = /root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json result_worker
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/result_worker.log

/etc/supervisord.d/processor.ini

[program:processor]
command = /root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json processor
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/processor.log

/etc/supervisord.d/phantomjs.ini

[program:phantomjs]
command = /pyspider/phantomjs --config=/pyspider/pjsconfig.json /pyspider/phantomjs_fetcher.js 25555
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/phantomjs.log

/etc/supervisord.d/fetcher.ini

[program:fetcher]
command = /root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json fetcher
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/fetcher.log

Create pjsconfig.json in the /pyspider directory:

{
  "ignoreSslErrors": true,
  "sslProtocol": "any",
  "outputEncoding": "utf8",
  "cookiesFile": "pyspider/phontjscookies.txt",
  "autoLoadImages": false
}

Download phantomjs into the /pyspider/ directory, and copy git/pyspider/pyspider/fetcher/phantomjs_fetcher.js there as /pyspider/phantomjs_fetcher.js.

# Reload
supervisorctl reload
# Check status
supervisorctl status
# output
fetcher        RUNNING pid 3446, uptime 0:00:07
phantomjs      RUNNING pid 3448, uptime 0:00:07
processor      RUNNING pid 3447, uptime 0:00:07
result_worker  RUNNING pid 3445, uptime 0:00:07

centos02 is now deployed.

2.3. centos03

Deploying the three processes fetcher, processor, and result_worker works the same as on centos02; on top of that, this server also runs the webui.

Create the file:

/etc/supervisord.d/webui.ini

[program:webui]
command = /root/anaconda3/envs/sbird/bin/python /root/git/pyspider/run.py -c /pyspider/config.json webui
directory = /root/git/pyspider
user = root
process_name = %(program_name)s
autostart = true
autorestart = true
startsecs = 3
redirect_stderr = true
stdout_logfile_maxbytes = 500MB
stdout_logfile_backups = 10
stdout_logfile = /pyspider/supervisor/webui.log

# Reload
supervisorctl reload
# Check status
supervisorctl status
# output
fetcher        RUNNING pid 2724, uptime 0:00:07
processor      RUNNING pid 2725, uptime 0:00:07
result_worker  RUNNING pid 2723, uptime 0:00:07
webui          RUNNING pid 2726, uptime 0:00:07

3. Summary

With the three servers above, pyspider now runs as a distributed crawler: centos01 hosts Redis, MariaDB, and the scheduler; centos02 and centos03 run the fetchers, processors, and result_workers, with phantomjs on centos02 and the webui on centos03. Open port 5000 on centos03 to manage projects from the WebUI.
