Download JSON response from website with array

I'm trying to get JSON responses from a website whose URL ends in a different number each time, for example:

  http://example.com/111
  http://example.com/112
  http://example.com/113
  ...
  http://example.com/200

Every request gives me a JSON object with the same fields but different data:

  {
    "uid": 111,
    "name": "Pinky",
    "title": "Mouse",
    "type": "No brain"
  }
  {
    "uid": 112,
    "name": "Brain",
    "title": "Big Mouse",
    "type": "Brain"
  }
  {
    "uid": 113,
    "name": "Garfield",
    "title": "Cat",
    "type": "lazy"
  }

I'm having trouble writing a Python script that fetches all the JSON responses for a defined range, e.g. [111 - 200], and merges them into a single array:

  [
    {
      "uid": 111,
      "name": "Pinky",
      "title": "Mouse",
      "type": "No brain"
    },
    {
      "uid": 112,
      "name": "Brain",
      "title": "Big Mouse",
      "type": "Brain"
    },
    {
      "uid": 113,
      "name": "Garfield",
      "title": "Cat",
      "type": "lazy"
    }
  ]

Please assist.

Answer 1

Score: 0


You can store the results in a list and then save the whole list to a JSON file at once.

Here is an example using the PokeAPI:

  import json
  import requests

  pokemons = []
  for poke_id in range(1, 7):
      url = f'https://pokeapi.co/api/v2/pokemon/{poke_id}'
      data = requests.get(url).json()
      data = {
          'id': data.pop('id'),
          'types': data.pop('types'),
          'base_experience': data.pop('base_experience')
      }  # keep only a few attributes
      print(data)
      pokemons.append(data)
  with open('pokemon.json', 'w') as f:
      json.dump(pokemons, f)

On each iteration of the loop, the script appends one page's result to the list; after the loop finishes, the whole list is written to a JSON file.
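The same pattern can be adapted to the URL range in the question. Below is a minimal sketch, not a definitive implementation: it assumes each `http://example.com/<uid>` endpoint returns one JSON object as shown above, and the `fetch` parameter is only there so the merging logic can be demonstrated offline with a stub (for the real site, call `fetch_range` without it).

```python
import json


def fetch_range(base_url, start, end, fetch=None):
    """Collect the JSON object behind base_url/start .. base_url/end into one list."""
    if fetch is None:
        def fetch(url):  # default fetcher: a real HTTP GET, as in the answer above
            import requests
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            return resp.json()
    merged = []
    for uid in range(start, end + 1):
        url = f'{base_url}/{uid}'
        try:
            merged.append(fetch(url))
        except Exception as exc:  # skip pages that error out or return bad JSON
            print(f'Skipping {url}: {exc}')
    return merged


# Demo with a stub fetcher so the sketch runs without a network; for the real
# site you would call fetch_range('http://example.com', 111, 200) instead.
def stub(url):
    return {'uid': int(url.rsplit('/', 1)[-1])}

records = fetch_range('http://example.com', 111, 113, fetch=stub)
print(json.dumps(records))
```

Writing the merged list out is then a single `json.dump(records, open('merged.json', 'w'))`, just as in the PokeAPI example.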

This is the output:

  [
    {
      "id": 1,
      "types": [
        {
          "slot": 1,
          "type": {
            "name": "grass",
            "url": "https://pokeapi.co/api/v2/type/12/"
          }
        },
        {
          "slot": 2,
          "type": {
            "name": "poison",
            "url": "https://pokeapi.co/api/v2/type/4/"
          }
        }
      ],
      "base_experience": 64
    },
    {
      "id": 2,
      "types": [
        {
          "slot": 1,
          "type": {
            "name": "grass",
            "url": "https://pokeapi.co/api/v2/type/12/"
          }
        },
        {
          "slot": 2,
          "type": {
            "name": "poison",
            "url": "https://pokeapi.co/api/v2/type/4/"
          }
        }
      ],
      "base_experience": 142
    },
    {
      "id": 3,
      "types": [
        {
          "slot": 1,
          "type": {
            "name": "grass",
            "url": "https://pokeapi.co/api/v2/type/12/"
          }
        },
        {
          "slot": 2,
          "type": {
            "name": "poison",
            "url": "https://pokeapi.co/api/v2/type/4/"
          }
        }
      ],
      "base_experience": 263
    },
    {
      "id": 4,
      "types": [
        {
          "slot": 1,
          "type": {
            "name": "fire",
            "url": "https://pokeapi.co/api/v2/type/10/"
          }
        }
      ],
      "base_experience": 62
    },
    {
      "id": 5,
      "types": [
        {
          "slot": 1,
          "type": {
            "name": "fire",
            "url": "https://pokeapi.co/api/v2/type/10/"
          }
        }
      ],
      "base_experience": 142
    },
    {
      "id": 6,
      "types": [
        {
          "slot": 1,
          "type": {
            "name": "fire",
            "url": "https://pokeapi.co/api/v2/type/10/"
          }
        },
        {
          "slot": 2,
          "type": {
            "name": "flying",
            "url": "https://pokeapi.co/api/v2/type/3/"
          }
        }
      ],
      "base_experience": 267
    }
  ]

huangapple
  • Posted on 2023-07-14 03:40:25
  • Please keep this link when republishing: https://go.coder-hub.com/76682732.html