vscode PermissionError: [Errno 13] Permission denied:

This error seems to come up everywhere, but I can't find a solution that matches my situation. At one point the script could access the folder, and I am still getting the error for that folder even after the section of code that used it was removed. I have restarted, run VS Code as admin, and confirmed my user has full access to the folders. I must have made some change to the code that broke it at some point, but everything looks correct. Why is it even referencing a folder that's not in the code?! There is no reference to it in the secrets module I import either.

error:

PermissionError: [Errno 13] Permission denied: './aws_s3/archive'

code:

import boto3
import os
import ignore.secrets as secrets
import shutil


# establish a connection
s3 = boto3.client(
    's3',
    region_name='us-east-1',
    aws_access_key_id=secrets.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=secrets.AWS_SECRET_ACCESS_KEY
)

# Create bucket
bucket_name = 'well-logs'

response = s3.create_bucket(
    Bucket=bucket_name
)

print(response)

# upload files
csv_directory = './aws_s3/'
for filename in os.listdir(csv_directory):
    file_path = os.path.join(csv_directory, filename)
    s3.upload_file(file_path, bucket_name, filename)

# Confirm successful upload
response = s3.list_objects(Bucket=bucket_name)
objects = response.get('Contents', [])
if objects:
    for obj in objects:
        print(obj['Key'])
else:
    print("No objects found in the bucket.")

removed section of code (still getting errors without this):

# Archive files
home_directory = './aws_s3/'
archive_directory = './aws_s3/archive/'

if not os.path.exists(archive_directory):
    os.makedirs(archive_directory)

for filename in os.listdir(home_directory):
    source_file = os.path.join(home_directory, filename)
    destination_folder = archive_directory
    shutil.move(source_file, destination_folder)

The full error even without the archive section in the script:

{'ResponseMetadata': {'RequestId': 'KRXB56SRP0ZSNRXY', 'HostId': 'dqLZ6yY8wcwX8zVIIhcemHwnATlu07hxmvs9EyChi/J5airBjle5mlHcD5rBd/VIo/0WqH5+n1Y=', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amz-id-2': 'dqLZ6yY8wcwX8zVIIhcemHwnATlu07hxmvs9EyChi/J5airBjle5mlHcD5rBd/VIo/0WqH5+n1Y=', 'x-amz-request-id': 'KRXB56SRP0ZSNRXY', 'date': 'Sun, 04 Jun 2023 21:25:51 GMT', 'location': '/well-logs', 'server': 'AmazonS3', 'content-length': '0'}, 'RetryAttempts': 0}, 'Location': '/well-logs'}
Traceback (most recent call last):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\upload-aws.py", line 28, in <module>
    s3.upload_file(file_path, bucket_name, filename)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\boto3\s3\inject.py", line 143, in upload_file
    return transfer.upload_file(
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\boto3\s3\transfer.py", line 292, in upload_file
    future.result()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\futures.py", line 103, in result
    return self._coordinator.result()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\futures.py", line 266, in result
    raise self._exception
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\tasks.py", line 139, in __call__
    return self._execute_main(kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\tasks.py", line 162, in _execute_main
    return_value = self._main(**kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\upload.py", line 758, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\client.py", line 530, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\client.py", line 933, in _make_api_call
    handler, event_response = self.meta.events.emit_until_response(
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 416, in emit_until_response
    return self._emitter.emit_until_response(aliased_event_name, **kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 271, in emit_until_response
    responses = self._emit(event_name, kwargs, stop_on_response=True)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 239, in _emit
    response = handler(**kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3088, in conditionally_calculate_md5
    md5_digest = calculate_md5(body, **kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3055, in calculate_md5
    binary_md5 = _calculate_md5_from_file(body)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3067, in _calculate_md5_from_file
    for chunk in iter(lambda: fileobj.read(1024 * 1024), b''):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3067, in <lambda>
    for chunk in iter(lambda: fileobj.read(1024 * 1024), b''):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 511, in read
    data = self._fileobj.read(amount_to_read)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\upload.py", line 91, in read
    return self._fileobj.read(amount)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 370, in read
    self._open_if_needed()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 361, in _open_if_needed
    self._fileobj = self._open_function(self._filename, self._mode)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 272, in open
    return open(filename, mode)
PermissionError: [Errno 13] Permission denied: './aws_s3/archive'

Answer 1 (score: 0)
As answered by John Gordon in the comments.

It's because ./aws_s3/archive is a directory, but your code is treating everything it finds in ./aws_s3/ as a file.

Solutions:

  1. Remove subdirectories
  2. If you do want to upload files in subdirectories, the usual approach is to use os.walk().
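A minimal sketch of both options, using the bucket name and directory from the question (the helper name `iter_upload_targets` is my own, not from the answer). For option 1 it is enough to add `if os.path.isfile(file_path):` inside the existing `os.listdir()` loop; for option 2, `os.walk()` visits every subdirectory and yields only file names, so a directory like `archive` is never passed to `upload_file`:

```python
import os


def iter_upload_targets(directory):
    """Yield (local_path, s3_key) pairs for every regular file under
    directory, recursing into subdirectories (unlike os.listdir)."""
    for root, _dirs, files in os.walk(directory):
        for filename in files:
            file_path = os.path.join(root, filename)
            # Use the path relative to the top-level directory as the key,
            # with forward slashes so keys look the same on Windows.
            key = os.path.relpath(file_path, directory).replace(os.sep, '/')
            yield file_path, key


if __name__ == '__main__':
    csv_directory = './aws_s3/'  # directory from the question
    if os.path.isdir(csv_directory):
        import boto3  # only needed for the actual upload
        s3 = boto3.client('s3')
        for file_path, key in iter_upload_targets(csv_directory):
            s3.upload_file(file_path, 'well-logs', key)
```

With this layout a file at `./aws_s3/archive/old.csv` is uploaded under the key `archive/old.csv`, so the subdirectory structure is preserved in the bucket.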

huangapple
  • Posted on 2023-06-05 05:42:11
  • Please keep this link when reposting: https://go.coder-hub.com/76402522.html