[Python] A scraper script for Wenjuanxing (问卷星) quiz pages: basically works on any survey, shared for reference and learning




Author: cj13888



Tags: Python



The pasted code displayed incorrectly; I have fixed it several times and it should be fine now.

Today my workplace asked us to take a quiz. When I opened the link on a PC, it said the quiz could not be answered on a computer and could only be answered by scanning a QR code in WeChat. Looking at the page, it turned out to be a Wenjuanxing questionnaire, and the questions and options are all written directly into the HTML. So I put together a simple scraper with BeautifulSoup and requests, and used a MySQL database to store the scraped content.

The number of options differs from question to question, so when you use the data you can simply store all of a question's options in one field, then read that field back and iterate over it whenever you need it (see the sketch right after this note).

Also, the site currently has no anti-scraping mechanism, so a plain while loop over the random-question URLs is enough to surface the whole question bank (with small changes the code could crawl the site's entire bank).

The code is fairly simple; I am sharing it for learning purposes only.
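
As a minimal sketch of that single-field idea, assuming a hypothetical questions_json table with one options column (the table and column names here are my own and are not part of the original script), the per-column insert could be replaced by something like this:

# -*- coding: utf-8 -*-
import json

import pymysql


def db_insert_json(title, options):
    # Hypothetical variant: store the whole option list as one JSON string,
    # so the column count no longer depends on how many choices a question has.
    sql_s = 'insert ignore into questions_json (title, options) values (%s, %s)'
    db = pymysql.connect(host='127.0.0.1', port=3306, user='root',
                         password='123', db='wenjuanxing', charset='utf8')
    try:
        with db.cursor() as cursor:
            cursor.execute(sql_s, (title, json.dumps(options, ensure_ascii=False)))
        db.commit()
    finally:
        db.close()

# Reading it back later: json.loads(stored_value) returns the original list to iterate over.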

# -*- coding: utf-8 -*-
import time
import random

import requests
from bs4 import BeautifulSoup
import pymysql
 
mysql_host = '127.0.0.1'
mysql_db = 'wenjuanxing'
mysql_user = 'root'
mysql_password = '123'
mysql_port = 3306
pages = []
 
 
def ua(refer_str):
    user = [
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; AcooBrowser; .NET CLR 1.1.4322; .NET CLR 2.0.50727)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; Acoo Browser; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; .NET CLR 3.0.04506)",
        "Mozilla/4.0 (compatible; MSIE 7.0; AOL 9.5; AOLBuild 4337.35; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)",
        "Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 2.0.50727; Media Center PC 6.0)",
        "Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 1.0.3705; .NET CLR 1.1.4322)",
        "Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 3.0.04506.30)",
        "Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN) AppleWebKit/523.15 (KHTML, like Gecko, Safari/419.3) Arora/0.3 (Change: 287 c9dfb30)",
        "Mozilla/5.0 (X11; U; Linux; en-US) AppleWebKit/527+ (KHTML, like Gecko, Safari/419.3) Arora/0.6",
        "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.2pre) Gecko/20070215 K-Ninja/2.1.1",
        "Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9) Gecko/20080705 Firefox/3.0 Kapiko/3.0",
        "Mozilla/5.0 (X11; Linux i686; U;) Gecko/20070322 Kazehakase/0.4.5",
        "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.8) Gecko Fedora/1.9.0.8-1.fc10 Kazehakase/0.5.6",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/535.20 (KHTML, like Gecko) Chrome/19.0.1036.7 Safari/535.20",
        "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; fr) Presto/2.9.168 Version/11.52",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.11 TaoBrowser/2.0 Safari/536.11",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.71 Safari/537.1 LBBROWSER",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; LBBROWSER)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E; LBBROWSER)",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.84 Safari/535.11 LBBROWSER",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E)",
        "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; QQBrowser/7.0.3698.400)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SV1; QQDownload 732; .NET4.0C; .NET4.0E; 360SE)",
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; QQDownload 732; .NET4.0C; .NET4.0E)",
        "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E)",
        "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.89 Safari/537.1",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.89 Safari/537.1",
        "Mozilla/5.0 (iPad; U; CPU OS 4_2_1 like Mac OS X; zh-cn) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5",
        "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.0b13pre) Gecko/20110307 Firefox/4.0b13pre",
        "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:16.0) Gecko/20100101 Firefox/16.0",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.64 Safari/537.11",
        "Mozilla/5.0 (X11; U; Linux x86_64; zh-CN; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10"
    ]
    headers = {'User-Agent': random.choice(user)}
    if refer_str:
        headers.update(Referer=refer_str)
    return headers
 
 
def db_insert(title, a, b, c):
    sql_s = '''insert ignore into questions (title, answer_a, answer_b, answer_c) values(%s, %s, %s, %s)'''
    db = pymysql.connect(host=mysql_host, port=mysql_port, user=mysql_user, password=mysql_password, db=mysql_db, charset='utf8')
    cursor = db.cursor()
    cursor.execute(sql_s, (title, a, b, c))
    db.commit()
    db.close()
 
 
def html_get(url, header):
    with requests.Session() as s:
        # requests.Session has no keep_alive attribute, so ask the server to
        # close the connection after each request instead.
        s.headers.update({'Connection': 'close'})
        while True:
            try:
                res_get = s.get(url, headers=header)
                # stream=True
                print(res_get.status_code)
                return res_get.content
            except Exception as e:
                print(e)
                print('Download failed: %s' % url)
                continue
 
 
def get_wjx(wj_url):
    html_str = html_get(wj_url, ua(refer_str=False))
    bs_str_all = BeautifulSoup(str(html_str, 'utf-8'), 'html.parser').findAll("div", attrs={"class": "field ui-field-contain"})
    # print(bs_str_all)
    for item in bs_str_all:
        t = ''
        ss = []
        title = item.findAll('div', attrs={'class': 'field-label'})
        sections = item.findAll('div', attrs={'class': 'ui-radio'})
        # if title[0].get_text() == '基本信息:*':
        #     print("跳过该记录")
        # else:
        for t in title:
            t = t.get_text()
        for section in sections:
            ss.append(section.get_text())
        if len(ss) < 1:
            print('Skipping a question with no options')
        else:
            # s_a = ss[0]
            # s_b = ss[1]
            # s_c = ss[2]
            print(t)
            print(ss)
            # insert into the database
            # db_insert(title=t, a=s_a, b=s_b, c=s_c)
 
 
def job():
    print('Start scraping')
    while True:
        url1 = 'https://ks.wjx.top/m/68258190.aspx'
        get_wjx(wj_url=url1)
        time.sleep(3)
 
 
if __name__ == '__main__':
    # debug run: job() already loops forever
    job()
 
    # loop: scrape once every three seconds
    # while True:
    #     job()
    #     time.sleep(3)
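
For reference, db_insert above relies on insert ignore, which only skips duplicates if the table has a unique key. A possible schema for the questions table, with a unique index on the title, is sketched below; this DDL is my own assumption and was not part of the original post:

# -*- coding: utf-8 -*-
import pymysql

# Assumed schema for the questions table used by db_insert; adjust column lengths as needed.
ddl = '''
create table if not exists questions (
    id int unsigned not null auto_increment primary key,
    title varchar(500) not null,
    answer_a varchar(500),
    answer_b varchar(500),
    answer_c varchar(500),
    unique key uk_title (title(191))
) default charset = utf8mb4
'''

db = pymysql.connect(host='127.0.0.1', port=3306, user='root',
                     password='123', db='wenjuanxing', charset='utf8')
with db.cursor() as cursor:
    cursor.execute(ddl)
db.commit()
db.close()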
Reproduction without permission is prohibited. Author: cj13888; when reposting, please credit the source 易启发资源网 with a hyperlink. Originally published 2020-09-23.
