
Web Scraping Project 8: Scraping Weather Forecasts

程序员文章站 2022-07-14 15:51:08

Goal

Fetch the weather forecast for the coming week from a weather API.

Project Setup

Software: PyCharm
Third-party libraries: requests, BeautifulSoup (csv is in the Python standard library)
API endpoint: http://api.k780.com:88/?app=weather.future&weaid=<city name>&appkey=10003&sign=b59bc3ef6191eb9f747dd4e83c99f2a4&format=xml
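Before writing the scraper, it may help to see how a query string like this can be assembled with the city name percent-encoded automatically; a minimal standard-library sketch (the parameter names and values are the ones from the endpoint above):

```python
from urllib.parse import urlencode

# Build the query string; non-ASCII city names such as '上海'
# are percent-encoded for us by urlencode.
params = {
    'app': 'weather.future',
    'weaid': '上海',
    'appkey': '10003',
    'sign': 'b59bc3ef6191eb9f747dd4e83c99f2a4',
    'format': 'xml',
}
url = 'http://api.k780.com:88/?' + urlencode(params)
print(url)
```

The same encoding happens implicitly if you pass a `params=` dict to `requests.get`, which is why the code below can interpolate the raw city name into the URL.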

API Analysis

http://api.k780.com:88/?app=weather.future&weaid=上海&appkey=10003&sign=b59bc3ef6191eb9f747dd4e83c99f2a4&format=xml

Open this URL in a browser to inspect the XML response. Each forecast day carries tags such as days, week, citynm, temperature, weather, wind, and winp, which is what the code below searches for.
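To see how those tags are extracted, here is a sketch run against a hand-written sample; only the tag names come from the real response, the values are made up:

```python
from bs4 import BeautifulSoup

# Illustrative sample mimicking one forecast entry in the XML response
# (the values here are invented for the sketch).
sample = """
<dataset>
  <item>
    <days>2022-07-14</days>
    <week>Thursday</week>
    <citynm>Shanghai</citynm>
    <temperature>33C/27C</temperature>
    <weather>Sunny</weather>
    <wind>SE</wind>
    <winp>Force 3</winp>
  </item>
</dataset>
"""
# 'html.parser' keeps this sketch dependency-free; the article uses 'lxml'
soup = BeautifulSoup(sample, 'html.parser')
days = [t.text for t in soup.find_all('days')]
print(days)
```

`find_all` returns every matching tag in document order, so with a seven-day response each of these lists ends up with seven entries.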

Code Implementation

import requests
from bs4 import BeautifulSoup

cityname = input('Enter a city name: ')
host = 'http://api.k780.com:88/?app=weather.future&weaid=%s' % cityname
url = host + '&appkey=10003&sign=b59bc3ef6191eb9f747dd4e83c99f2a4&format=xml'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'lxml')

# Pull each field out of the XML by tag name
day = [t.text for t in soup.find_all('days')]                  # date
week = [t.text for t in soup.find_all('week')]                 # day of week
city = [t.text for t in soup.find_all('citynm')]               # city name
temperature = [t.text for t in soup.find_all('temperature')]   # temperature
weather = [t.text for t in soup.find_all('weather')]           # conditions
wind = [t.text for t in soup.find_all('wind')]                 # wind direction
winp = [t.text for t in soup.find_all('winp')]                 # wind force

for i in range(len(day)):
    print(day[i], week[i], city[i], temperature[i], weather[i], wind[i], winp[i])
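Indexing seven parallel lists works, but zip can pair them into one tuple per forecast day; a small sketch with dummy data standing in for the parsed lists:

```python
# Dummy stand-ins for three of the parsed lists
# (the real values come from the API response).
day = ['2022-07-14', '2022-07-15']
week = ['Thursday', 'Friday']
temperature = ['33C/27C', '32C/26C']

# zip pairs the parallel lists into one row per forecast day
rows = list(zip(day, week, temperature))
for row in rows:
    print(*row)
```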

Output


Saving to a Local File

Inside the print loop, append each row to a CSV file (this also needs import csv at the top):

    with open('F:/pycharm文件/document/data.csv', 'a', newline='') as f:
        csvwriter = csv.writer(f, delimiter=',')
        csvwriter.writerow([day[i], week[i], city[i], temperature[i], weather[i], wind[i], winp[i]])
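Re-opening the file in append mode on every pass also means the CSV never gets a header row; a sketch of writing the header once, using an in-memory buffer in place of the file on disk (the column names here are my own, not part of the API):

```python
import csv
import io

# StringIO stands in for the file on disk so the sketch is self-contained
buf = io.StringIO()
writer = csv.writer(buf, delimiter=',')
# Write the header once, then one row per forecast day
writer.writerow(['date', 'week', 'city', 'temperature', 'weather', 'wind', 'wind_force'])
writer.writerow(['2022-07-14', 'Thursday', 'Shanghai', '33C/27C', 'Sunny', 'SE', 'Force 3'])
lines = buf.getvalue().splitlines()
print(lines)
```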

The complete code:

import csv

import requests
from bs4 import BeautifulSoup

cityname = input('Enter a city name: ')
host = 'http://api.k780.com:88/?app=weather.future&weaid=%s' % cityname
url = host + '&appkey=10003&sign=b59bc3ef6191eb9f747dd4e83c99f2a4&format=xml'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'lxml')

# Pull each field out of the XML by tag name
day = [t.text for t in soup.find_all('days')]                  # date
week = [t.text for t in soup.find_all('week')]                 # day of week
city = [t.text for t in soup.find_all('citynm')]               # city name
temperature = [t.text for t in soup.find_all('temperature')]   # temperature
weather = [t.text for t in soup.find_all('weather')]           # conditions
wind = [t.text for t in soup.find_all('wind')]                 # wind direction
winp = [t.text for t in soup.find_all('winp')]                 # wind force

# Open the CSV once instead of re-opening it on every iteration;
# encoding='utf-8' keeps non-ASCII text portable across platforms
with open('F:/pycharm文件/document/data.csv', 'a', newline='', encoding='utf-8') as f:
    csvwriter = csv.writer(f, delimiter=',')
    for i in range(len(day)):
        print(day[i], week[i], city[i], temperature[i], weather[i], wind[i], winp[i])
        csvwriter.writerow([day[i], week[i], city[i], temperature[i], weather[i], wind[i], winp[i]])
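The code above assumes the request always succeeds; in practice the network can fail or the API can return an error status. A hedged sketch of wrapping the fetch with error handling (`fetch_forecast` is a name introduced here, not from the article):

```python
import requests

def fetch_forecast(url, timeout=10):
    """Fetch the forecast XML, returning None on any request failure."""
    try:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()  # surface HTTP 4xx/5xx responses as exceptions
        return resp.text
    except requests.exceptions.RequestException:
        return None

# A malformed URL fails without touching the network
print(fetch_forecast('not-a-url'))  # None
```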

Disclaimer: for personal learning and reference only.

Tags: web scraping study notes