crawl-300d-2M-subword.zip corrupted or cannot be downloaded

Question


I am trying to use the fasttext model crawl-300d-2M-subword.zip from the official page on my Windows machine, but the download fails within the last few KB.

I managed to successfully download the zip file onto my Ubuntu server using wget, but the archive turns out to be corrupted whenever I try to unzip it. Here is an example of what I get:

unzip crawl-300d-2M-subword.zip
Archive:  crawl-300d-2M-subword.zip
  inflating: crawl-300d-2M-subword.vec
  inflating: crawl-300d-2M-subword.bin   bad CRC ff925bde  (should be e9be08f7)

It is always crawl-300d-2M-subword.bin, the file I am actually interested in, that fails during unzipping.

I have tried both approaches many times with no success. It seems no one has run into this issue before.

Answer 1

Score: 1

I've just downloaded & unzipped that file with no errors, so the problem is likely unique to your system's configuration, tools, or its network-path to the download servers.

One common problem that's sometimes not prominently reported by a tool like wget is a download that keeps ending early, resulting in a truncated local file.

  • Is the zip file you received exactly 681,808,098 bytes long? (That's what I get.)
  • What if you try another download tool instead, like curl? (Such a relay between different endpoints might not trigger the same problems.)
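Acting on those two suggestions might look like the sketch below. The URL is the one the official fastText downloads page points at; double-check it against the page you actually used:

```shell
# Compare the local size against the expected 681,808,098 bytes
stat -c %s crawl-300d-2M-subword.zip    # GNU stat (Linux); on macOS use: stat -f %z

# Re-download with curl, following redirects (-L) and resuming a
# partial file instead of restarting from zero (-C -)
curl -L -C - -O "https://dl.fbaipublicfiles.com/fasttext/vectors-english/crawl-300d-2M-subword.zip"
```

If the size printed by stat falls short of the expected figure, the download was truncated, and re-running the curl command picks up where it left off rather than starting over.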

Sometimes if repeated downloads keep failing in the same way, it's due to subtle misconfiguration bugs/corruption unique to the network path from your machine to the peer (download origin) machine.

  • Can you do a successful download of the zip file (of full size per above) to anywhere else?
  • Then, transfer from that secondary location to where you really want it?
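A sketch of that relay, where user@server:/data/ is a placeholder for wherever you actually want the file: rsync's --partial flag keeps an interrupted transfer on disk so a re-run resumes it instead of restarting.

```shell
# Push the archive from the secondary machine to the target server;
# --partial preserves an interrupted transfer so re-running resumes it
rsync --partial --progress crawl-300d-2M-subword.zip user@server:/data/
```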

If you're having problems on both a Windows machine and an Ubuntu server, are they both on the same local network, perhaps subject to the same network issues – either bugs, or policies that cut a particular long download short?


huangapple
  • Published 2023-02-18 19:18:52
  • When reposting, please keep this link: https://go.coder-hub.com/75492977.html