vscode PermissionError: [Errno 13] Permission denied:
Question
This error seems to be everywhere, but I can't find a solution that matches my situation. At one point it could access the folder, and I'm still getting the error for the folder even after that section of the code was removed. I've restarted, run VS Code as admin, and confirmed the user has full access to the folders. I must have made some change to the code that broke it at some point, but everything looks correct. Why is it even calling a folder that's not in the code?! There's no reference to it in the secrets I import either.
error:
PermissionError: [Errno 13] Permission denied: './aws_s3/archive'
code:
import boto3
import os
import ignore.secrets as secrets
import shutil

# establish a connection
s3 = boto3.client(
    's3',
    region_name='us-east-1',
    aws_access_key_id=secrets.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=secrets.AWS_SECRET_ACCESS_KEY
)

# Create bucket
bucket_name = 'well-logs'
response = s3.create_bucket(
    Bucket=bucket_name
)
print(response)

# upload files
csv_directory = './aws_s3/'
for filename in os.listdir(csv_directory):
    file_path = os.path.join(csv_directory, filename)
    s3.upload_file(file_path, bucket_name, filename)

# Confirm successful upload
response = s3.list_objects(Bucket=bucket_name)
objects = response.get('Contents', [])
if objects:
    for obj in objects:
        print(obj['Key'])
else:
    print("No objects found in the bucket.")
removed section of code (still getting errors without this):
# Archive files
home_directory = './aws_s3/'
archive_directory = './aws_s3/archive/'
if not os.path.exists(archive_directory):
    os.makedirs(archive_directory)
for filename in os.listdir(home_directory):
    source_file = os.path.join(home_directory, filename)
    destination_folder = archive_directory
    shutil.move(source_file, destination_folder)
The full error even without the archive section in the script:
{'ResponseMetadata': {'RequestId': 'KRXB56SRP0ZSNRXY', 'HostId': 'dqLZ6yY8wcwX8zVIIhcemHwnATlu07hxmvs9EyChi/J5airBjle5mlHcD5rBd/VIo/0WqH5+n1Y=', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amz-id-2': 'dqLZ6yY8wcwX8zVIIhcemHwnATlu07hxmvs9EyChi/J5airBjle5mlHcD5rBd/VIo/0WqH5+n1Y=', 'x-amz-request-id': 'KRXB56SRP0ZSNRXY', 'date': 'Sun, 04 Jun 2023 21:25:51 GMT', 'location': '/well-logs', 'server': 'AmazonS3', 'content-length': '0'}, 'RetryAttempts': 0}, 'Location': '/well-logs'}
Traceback (most recent call last):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\upload-aws.py", line 28, in <module>
    s3.upload_file(file_path, bucket_name, filename)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\boto3\s3\inject.py", line 143, in upload_file
    return transfer.upload_file(
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\boto3\s3\transfer.py", line 292, in upload_file
    future.result()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\futures.py", line 103, in result
    return self._coordinator.result()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\futures.py", line 266, in result
    raise self._exception
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\tasks.py", line 139, in __call__
    return self._execute_main(kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\tasks.py", line 162, in _execute_main
    return_value = self._main(**kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\upload.py", line 758, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\client.py", line 530, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\client.py", line 933, in _make_api_call
    handler, event_response = self.meta.events.emit_until_response(
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 416, in emit_until_response
    return self._emitter.emit_until_response(aliased_event_name, **kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 271, in emit_until_response
    responses = self._emit(event_name, kwargs, stop_on_response=True)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\hooks.py", line 239, in _emit
    response = handler(**kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3088, in conditionally_calculate_md5
    md5_digest = calculate_md5(body, **kwargs)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3055, in calculate_md5
    binary_md5 = _calculate_md5_from_file(body)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3067, in _calculate_md5_from_file
    for chunk in iter(lambda: fileobj.read(1024 * 1024), b''):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\botocore\utils.py", line 3067, in <lambda>
    for chunk in iter(lambda: fileobj.read(1024 * 1024), b''):
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 511, in read
    data = self._fileobj.read(amount_to_read)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\upload.py", line 91, in read
    return self._fileobj.read(amount)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 370, in read
    self._open_if_needed()
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 361, in _open_if_needed
    self._fileobj = self._open_function(self._filename, self._mode)
  File "d:\ds_projects\dcj-manufacturing-oil-rig\env\lib\site-packages\s3transfer\utils.py", line 272, in open
    return open(filename, mode)
PermissionError: [Errno 13] Permission denied: './aws_s3/archive'
Answer 1
Score: 0
As answered by John Gordon in the comments.
It's because ./aws_s3/archive is a directory, but your code is treating everything it finds in ./aws_s3/ as a file.
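You can reproduce the underlying failure without S3 at all. The last frame of the traceback is a plain open(filename, mode) inside s3transfer, and calling open() on a directory raises an OSError: PermissionError with Errno 13 on Windows (matching the error above), IsADirectoryError on POSIX. A minimal demonstration, using a temp directory as a stand-in for ./aws_s3/archive:

```python
import tempfile

directory = tempfile.mkdtemp()  # stand-in for ./aws_s3/archive
err = None
try:
    # s3transfer ultimately calls open() on every path it is handed
    open(directory, 'rb')
except OSError as exc:
    err = exc  # PermissionError on Windows, IsADirectoryError on POSIX
print(type(err).__name__)
```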
Solutions:
- Remove subdirectories
- If you do want to upload files in subdirectories, the usual approach is to use os.walk().
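A minimal sketch of both options, assuming the layout from the question (a temp directory stands in for ./aws_s3/, and the actual s3.upload_file calls are left as comments since they need a live client and bucket):

```python
import os
import tempfile

# Build a sample tree mirroring the question: CSVs plus an 'archive' subdir.
root = tempfile.mkdtemp()
csv_directory = os.path.join(root, 'aws_s3')
os.makedirs(os.path.join(csv_directory, 'archive'))
for name in ('a.csv', 'b.csv'):
    with open(os.path.join(csv_directory, name), 'w') as f:
        f.write('data')
with open(os.path.join(csv_directory, 'archive', 'old.csv'), 'w') as f:
    f.write('old')

# Option 1: skip anything that is not a regular file.
top_level_files = [
    os.path.join(csv_directory, filename)
    for filename in os.listdir(csv_directory)
    if os.path.isfile(os.path.join(csv_directory, filename))
]
# for path in top_level_files:
#     s3.upload_file(path, bucket_name, os.path.basename(path))

# Option 2: walk the tree and upload files from subdirectories too,
# using the path relative to csv_directory as the S3 object key.
all_keys = []
for dirpath, dirnames, filenames in os.walk(csv_directory):
    for filename in filenames:
        path = os.path.join(dirpath, filename)
        key = os.path.relpath(path, csv_directory).replace(os.sep, '/')
        all_keys.append(key)
        # s3.upload_file(path, bucket_name, key)
```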