Python code to extract the max pain value of a SBIN stock from https://www.niftytrader.in/stock-options-chart/sbin not working – what am I missing?


Question

I want Python code to scrape the max pain value of the SBIN stock from the URL https://www.niftytrader.in/stock-options-chart/sbin

My code is:

```py
import requests
from bs4 import BeautifulSoup

url = "https://www.niftytrader.in/stock-options-chart/sbin"

response = requests.get(url)
soup = BeautifulSoup(response.content, "html.parser")

max_pain_element = soup.find("span", class_="fs-3 label-color-3")

if max_pain_element:
    # Extract the Max Pain value
    max_pain = max_pain_element.text.strip()
    print("Max Pain:", max_pain)
else:
    print("Max Pain value not found on the webpage.")
```

But I am getting the output "Max Pain value not found on the webpage."


Answer 1

Score: 0

The data you see on the page is loaded from an external URL via JavaScript, so `beautifulsoup` doesn't see it. To get all the data into a dataframe, you can use the following example:

```py
import requests
import pandas as pd

# The page loads its data from this API endpoint.
api_url = 'https://h9cg992bof.execute-api.ap-south-1.amazonaws.com/webapi/symbol/psymbol-list'

data = requests.get(api_url).json()
df = pd.DataFrame(data['resultData'])
print(df)
```

Output:

```
    symbol_name  today_close  prev_close  max_pain  lot_size
0      AARTIIND       494.70      495.65     500.0     850.0
1           ABB      3894.75     3897.50    3700.0     250.0
2    ABBOTINDIA     20898.45    20970.55   21000.0      40.0
3     ABCAPITAL       163.90      164.80     165.0    5400.0
4         ABFRL       190.85      193.70     200.0    2600.0
5           ACC      1729.05     1712.45    1760.0     250.0
6      ADANIENT      1956.05     1890.00    1900.0     250.0
7    ADANIPORTS       688.10      664.95     690.0     625.0
8         ALKEM      3302.10     3326.30    3400.0     200.0

...
```

To get only the SBIN value:

```py
print(df[df.symbol_name == 'SBIN'].iloc[0]['max_pain'])
```

Output:

```
580.0
```
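The lookup can also be factored into a small helper that returns `None` instead of raising an `IndexError` when a symbol is missing from the response. This is a minimal sketch; the `resultData`, `symbol_name`, and `max_pain` field names come from the API response shown above, while the sample payload here is illustrative:

```python
def max_pain_for(symbol, payload):
    """Return the max_pain value for `symbol`, or None if it is not present."""
    for row in payload.get("resultData", []):
        if row.get("symbol_name") == symbol:
            return row.get("max_pain")
    return None

# Illustrative payload mirroring the structure of the API response.
sample = {"resultData": [
    {"symbol_name": "SBIN", "max_pain": 580.0},
    {"symbol_name": "ACC", "max_pain": 1760.0},
]}

print(max_pain_for("SBIN", sample))  # 580.0
print(max_pain_for("XYZ", sample))   # None
```

In real use you would pass `requests.get(api_url).json()` as the payload.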


huangapple
  • Posted on 2023-05-21 02:09:53
  • Please retain this link when reposting: https://go.coder-hub.com/76296709.html