<Blob: BUCKETNAME, PATH, None> could not be converted to unicode when copying data between GCS buckets in Python

Question

I'm trying to copy a file from one GCS bucket to another using Python, but the copy fails with the error below.

Error:

An error was encountered:
<Blob: [[BUCKET]], [[PATH]], None> could not be converted to unicode
Traceback (most recent call last):
  File "/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1679321656981_0001/container_e01_1679321656981_0001_01_000001/mllibs/lib/python3.7/site-packages/google/cloud/storage/bucket.py", line 1901, in copy_blob
    new_blob = Blob(bucket=destination_bucket, name=new_name)
  File "/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1679321656981_0001/container_e01_1679321656981_0001_01_000001/mllibs/lib/python3.7/site-packages/google/cloud/storage/blob.py", line 216, in __init__
    name = _bytes_to_unicode(name)
  File "/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1679321656981_0001/container_e01_1679321656981_0001_01_000001/mllibs/lib/python3.7/site-packages/google/cloud/_helpers/__init__.py", line 352, in _bytes_to_unicode
    raise ValueError("%r could not be converted to unicode" % (value,))
ValueError: <Blob: [[BUCKET]], [[PATH]], None> could not be converted to unicode

The code:

from google.cloud import storage
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("FILE.json")
client = storage.Client(project='PROJECT', credentials=credentials)

print(client)

source_bucket = client.bucket(str(source_bucket))
source_blob = source_bucket.blob(str(source_blob))
destination_bucket = client.bucket(str(destination_bucket))
destination_blob_name = source_bucket.blob(str(destination_blob_name))

blob_copy = source_bucket.copy_blob(source_blob, destination_bucket, destination_blob_name)

My Google package versions for Python:

google-api-core          2.11.0
google-auth              2.16.0
google-cloud-bigquery    3.4.1
google-cloud-core        2.3.2
google-cloud-storage     2.7.0
google-crc32c            1.5.0
google-resumable-media   2.4.0
googleapis-common-protos 1.58.0

I have no idea what is wrong or why this ValueError is raised. I followed Google's documentation.

Answer 1

Score: 2

The argument destination_blob_name passed to copy_blob is supposed to be a string (the new object's name), not a Blob.

So you need to change

destination_blob_name=source_bucket.blob(str(destination_blob_name))

to

destination_blob_name = str(destination_blob_name)
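
For reference, here is a minimal sketch of the whole copy with that fix applied. The credentials are loaded the same way as in the question, and the bucket and object names are placeholders to replace with your own:

from google.cloud import storage
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("FILE.json")
client = storage.Client(project="PROJECT", credentials=credentials)

# Placeholder bucket and object names -- replace with your own.
source_bucket = client.bucket("source-bucket-name")
source_blob = source_bucket.blob("path/to/source/object")
destination_bucket = client.bucket("destination-bucket-name")
destination_blob_name = "path/to/destination/object"  # a plain string, not a Blob

# copy_blob expects the new object's name (its new_name argument) as a string.
blob_copy = source_bucket.copy_blob(source_blob, destination_bucket, destination_blob_name)
print(f"Copied {source_blob.name} to gs://{destination_bucket.name}/{blob_copy.name}")

If you omit the new name entirely, copy_blob copies the object under its original name.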
