requests is an HTTP library for Python, built on top of urllib and released under the Apache2 Licensed open-source license. Compared with urllib, requests is much more convenient and saves a great deal of work, so it is the recommended library for web crawlers.
1. Send a basic GET request with parameters

import requests

def get_html(url):
    # these parameters are appended to the URL as the query string
    param = {}  # fill in the query parameters as key/value pairs
    html = requests.get(url, params=param)
    if html.status_code == 200:
        html.encoding = "utf8"
        print(html.text)
    else:
        print("error", html)

if __name__ == '__main__':
    url = ""
    get_html(url)
2. Send a POST request and submit a request body

# POST a form
import requests

def post_html(url):
    data = {}  # fill in the form fields as key/value pairs
    html = requests.post(url, data=data)
    if html.status_code == 200:
        html.encoding = "utf8"
        print(html.text)
    else:
        print("get error:" + html.url)

if __name__ == '__main__':
    url = ""
    post_html(url)
3. Get JSON data
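A minimal sketch, assuming the target URL returns a JSON body; get_json and the empty url are placeholder names. The json() method of the response parses the body into a Python object:

# get JSON data
import requests

def get_json(url):
    html = requests.get(url)
    if html.status_code == 200:
        data = html.json()  # parse the JSON response body into a dict/list
        print(data)
    else:
        print("error:" + url)

if __name__ == '__main__':
    url = ""
    get_json(url)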
5. Use a session to maintain state across requests
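A minimal sketch, assuming a login-then-fetch flow; get_with_session, the URLs, and the form fields are placeholders. A Session object keeps cookies and reuses the underlying connection, so you do not have to re-establish state for every request:

# use a session to keep cookies and the connection alive across requests
import requests

def get_with_session(login_url, data_url):
    session = requests.Session()
    # cookies set by this request are stored on the session
    session.post(login_url, data={})  # placeholder login form fields
    # later requests reuse the stored cookies and the pooled connection
    html = session.get(data_url)
    if html.status_code == 200:
        print(html.text)
    else:
        print("error:" + data_url)

if __name__ == '__main__':
    get_with_session("", "")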
6. Ignore HTTPS certificate verification

# ignore HTTPS certificate verification
import requests

def get_url(url):
    html = requests.get(url, verify=False)
    if html.status_code == 200:
        print("ok")
    else:
        print("error:" + url)

if __name__ == '__main__':
    url = ""
    get_url(url)
7. Use an IP proxy

# use an IP proxy
import requests

proxies = {}  # fill in the proxy addresses here

def get_url(url):
    html = requests.get(url, proxies=proxies, timeout=3, verify=False)
    if html.status_code == 200:
        print("ok")
    else:
        print("get error:" + url)

if __name__ == '__main__':
    url = ""
    get_url(url)
8. Upload a file with POST

# upload a file with POST
import requests

# "test.txt" is a placeholder; the original filename is not shown
file = {"file": open("test.txt", "rb")}

def post_html(url):
    html = requests.post(url, files=file)
    if html.status_code == 200:
        html.encoding = "utf8"
        print(html.text)
    else:
        print("get error:" + html.url)

if __name__ == '__main__':
    url = ""
    post_html(url)