How do I maintain logged in session with golang for scraping?

I'm trying to scrape data from a website that requires a user/password login, using Go. With Python this is simple using the requests library:

import requests

session = requests.Session()
session.post("https://site.com/login", data={ 'username': 'user', 'password': '123456' })

# access URL that requires authentication
resp = session.get('https://site.com/restricted/url')

What is a simple way to accomplish the same thing with Go? Thanks.

Answer 1

Score: 6

Create a custom HTTP Client instance and attach a cookie jar to it.

Answer 2

Score: 2

I wrote a scraping framework called Colly which handles HTTP sessions out of the box. You can achieve the mentioned functionality similarly:

c := colly.NewCollector()
c.Post("https://example.com/login", map[string]string{"user": "x", "pass": "y"})

The code can be found on GitHub. A complete example of handling authentication is also available.

Posted by huangapple on 2013-12-01 19:51:54. Please keep the original link when reposting: https://go.coder-hub.com/20311803.html