Easiest way to copy data from a VM instance in GCP?
Question
I have a VM with a 60 GB disk, of which 45 GB is used. It has multiple users.
I have deleted the old users, so about 40 GB is now used. I want to download all the data to my local machine. How do I do that? The only way I know is scp,
with a command like this one: gcloud compute scp --recurse <$USER>@<INSTANCE>:/home/$USER ~/$LOCAL_DIR
but it's very slow and might even take a couple of days. Is there a better and easier way to download the data?
Answer 1
Score: 3
The bottleneck here seems to be that you're trying to download a considerable amount of data over SSH, which is quite slow.
One thing you can try to speed up the process is to break the download into two parts:
- Upload the content of your VM to a Cloud Storage bucket
- Download the content from Cloud Storage to your machine
So in that case, from the VM you'd execute:
gsutil cp -R /home/$USER gs://BUCKET_NAME
And then from your local machine:
gsutil cp -R gs://BUCKET_NAME ~/$LOCAL_DIR
Gsutil uses parallel composite uploads to speed up uploading large files, in case you have any. And in the first step you'll be doing GCP <-> GCP communication, which will be faster than downloading over SSH directly.
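The two steps above can be sketched end to end as follows. This is a minimal sketch, not a definitive recipe: BUCKET_NAME, the us-central1 location, and the final cleanup step are assumptions you should adapt to your own project; the -m flag tells gsutil to run transfers in parallel, which helps most when copying many files.

```shell
# On the VM: create a bucket (names are globally unique) in a region
# close to the VM, then upload the home directory into it.
# BUCKET_NAME and the location are placeholders -- adjust as needed.
gsutil mb -l us-central1 gs://BUCKET_NAME
gsutil -m cp -R /home/$USER gs://BUCKET_NAME

# On your local machine: download the uploaded folder.
# The upload above lands under gs://BUCKET_NAME/$USER/...
gsutil -m cp -R gs://BUCKET_NAME/$USER ~/$LOCAL_DIR

# Optional cleanup once the download is verified, to avoid
# ongoing storage charges for the bucket's contents.
gsutil rm -r gs://BUCKET_NAME
```

If the transfer might be interrupted, `gsutil rsync -r` is an alternative to `cp -R`: rerunning it copies only the files that are missing or changed at the destination, so you can resume instead of starting over.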
Comments