vscode PermissionError: [Errno 13] Permission denied:

This error seems to come up everywhere, but I can't find a solution that matches my situation. At one point the script could access the folder, and I am still getting the error for that folder even after the section of code that referenced it was removed. I have restarted, run VS Code as admin, and confirmed the user has full access to the folders. I must have changed something in the code to break it at some point, but everything looks correct. Why is it even referencing a folder that's not in the code?! There is no reference to it in the secrets module I import either.

error:

PermissionError: [Errno 13] Permission denied: './aws_s3/archive'

code:

import boto3
import os
import ignore.secrets as secrets
import shutil

# establish a connection
s3 = boto3.client(
    's3',
    region_name='us-east-1',
    aws_access_key_id=secrets.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=secrets.AWS_SECRET_ACCESS_KEY
)

# create bucket
bucket_name = 'well-logs'
response = s3.create_bucket(
    Bucket=bucket_name
)
print(response)

# upload files
csv_directory = './aws_s3/'
for filename in os.listdir(csv_directory):
    file_path = os.path.join(csv_directory, filename)
    s3.upload_file(file_path, bucket_name, filename)

# confirm successful upload
response = s3.list_objects(Bucket=bucket_name)
objects = response.get('Contents', [])
if objects:
    for obj in objects:
        print(obj['Key'])
else:
    print("No objects found in the bucket.")

removed section of code (still getting errors without this):

# Archive files
home_directory = './aws_s3/'
archive_directory = './aws_s3/archive/'
if not os.path.exists(archive_directory):
    os.makedirs(archive_directory)
for filename in os.listdir(home_directory):
    source_file = os.path.join(home_directory, filename)
    destination_folder = archive_directory
    shutil.move(source_file, destination_folder)

The full error even without the archive section in the script:

{'ResponseMetadata': {'RequestId': 'KRXB56SRP0ZSNRXY', 'HostId': 'dqLZ6yY8wcwX8zVIIhcemHwnATlu07hxmvs9EyChi/J5airBjle5mlHcD5rBd/VIo/0WqH5+n1Y=', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amz-id-2': 'dqLZ6yY8wcwX8zVIIhcemHwnATlu07hxmvs9EyChi/J5airBjle5mlHcD5rBd/VIo/0WqH5+n1Y=', 'x-amz-request-id': 'KRXB56SRP0ZSNRXY', 'date': 'Sun, 04 Jun 2023 21:25:51 GMT', 'location': '/well-logs', 'server': 'AmazonS3', 'content-length': '0'}, 'RetryAttempts': 0}, 'Location': '/well-logs'}
Traceback (most recent call last):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\upload-aws.py", line 28, in <module>
    s3.upload_file(file_path, bucket_name, filename)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\boto3\s3\inject.py", line 143, in upload_file
    return transfer.upload_file(
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\boto3\s3\transfer.py", line 292, in upload_file
    future.result()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\futures.py", line 103, in result
    return self._coordinator.result()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\futures.py", line 266, in result
    raise self._exception
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\tasks.py", line 139, in __call__
    return self._execute_main(kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\tasks.py", line 162, in _execute_main
    return_value = self._main(**kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\upload.py", line 758, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\client.py", line 530, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\client.py", line 933, in _make_api_call
    handler, event_response = self.meta.events.emit_until_response(
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 416, in emit_until_response
    return self._emitter.emit_until_response(aliased_event_name, **kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 271, in emit_until_response
    responses = self._emit(event_name, kwargs, stop_on_response=True)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 239, in _emit
    response = handler(**kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3088, in conditionally_calculate_md5
    md5_digest = calculate_md5(body, **kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3055, in calculate_md5
    binary_md5 = _calculate_md5_from_file(body)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3067, in _calculate_md5_from_file
    for chunk in iter(lambda: fileobj.read(1024 * 1024), b''):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3067, in <lambda>
    for chunk in iter(lambda: fileobj.read(1024 * 1024), b''):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 511, in read
    data = self._fileobj.read(amount_to_read)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\upload.py", line 91, in read
    return self._fileobj.read(amount)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 370, in read
    self._open_if_needed()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 361, in _open_if_needed
    self._fileobj = self._open_function(self._filename, self._mode)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 272, in open
    return open(filename, mode)
PermissionError: [Errno 13] Permission denied: './aws_s3/archive'

Answer 1

Score: 0

As answered by John Gordon in the comments.

It's because ./aws_s3/archive is a directory, but your code treats everything it finds in ./aws_s3/ as a file. Removing the archive section of the script doesn't help, because an earlier run already created the archive folder on disk; os.listdir() still returns it, and upload_file() then tries to open() it for reading, which fails.
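
A quick way to confirm this (a minimal sketch; the listed file names are hypothetical examples of your folder's contents):

import os

print(os.listdir('./aws_s3/'))            # e.g. ['archive', 'log1.csv', ...] -- directory names are included
print(os.path.isdir('./aws_s3/archive'))  # True: open() on a directory raises
                                          # PermissionError [Errno 13] on Windows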

Solutions:

  1. Remove subdirectories, or skip anything that isn't a regular file (first sketch below).
  2. If you do want to upload files in subdirectories, the usual approach is to use os.walk() (second sketch below).
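
A minimal sketch of the first option, reusing os, the s3 client, and bucket_name from the question's script: filter with os.path.isfile() so directories are never handed to upload_file().

csv_directory = './aws_s3/'
for filename in os.listdir(csv_directory):
    file_path = os.path.join(csv_directory, filename)
    # Skip './aws_s3/archive' and any other directory; only regular
    # files can be opened and uploaded.
    if os.path.isfile(file_path):
        s3.upload_file(file_path, bucket_name, filename)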
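
And a sketch of the os.walk() option under the same assumptions. Building the object key from the path relative to csv_directory is a naming choice, not something S3 requires; it keeps files from archive/ under an archive/ prefix in the bucket.

for root, dirs, files in os.walk(csv_directory):
    for filename in files:
        file_path = os.path.join(root, filename)
        # './aws_s3/archive/log1.csv' -> 'archive/log1.csv' (hypothetical file)
        key = os.path.relpath(file_path, csv_directory).replace(os.sep, '/')
        s3.upload_file(file_path, bucket_name, key)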
