pg_restore: error: error returned by PQputCopyData when restoring database to heroku
Question
I have my local database and I created a dump file that is 1.1 GB. I need to put that data on Heroku, but every time I run this command:
pg_restore --verbose --clean --no-acl --no-owner -h heroku-host -U db-heroku-username -d db-name dumpfile
my internet connection slows down, and after a few minutes I get the following error:
pg_restore: error: error returned by PQputCopyData: server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
I don't know the reason. I suspect that my internet connection is too weak to transfer the data, but nothing else comes to mind. Also, I saw that on the Heroku site the number of rows stays the same (about 16.9k) but the size of the data changes.
This is the last line before the error:
pg_restore: processing data for table "public.stock_app_stock"
I tried changing the command and also made a few dump files with different configurations, but the problem is always the same.
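For reference, pg_restore expects a custom-format archive, and Heroku's import/export guide generally recommends creating the dump without ACLs or ownership information. A minimal sketch of such a pg_dump invocation, where myuser, mydb, and mydb.dump are placeholder names:

pg_dump -Fc --no-acl --no-owner -h localhost -U myuser -f mydb.dump mydb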
Answer 1
Score: 0
The problem was that the database was big and my internet connection was not good enough to handle the data transfer between my local DB and the Heroku PostgreSQL DB. It also helped when I upgraded the Heroku Postgres plan to standard-0.
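As an alternative that avoids streaming a large archive from a local machine over an unreliable connection, the Heroku CLI can pull a dump directly from a publicly accessible URL and restore it server-side. A minimal sketch, assuming the dump has been uploaded somewhere reachable over HTTPS and the app is named my-app (both placeholders):

heroku pg:backups:restore 'https://example.com/mydb.dump' DATABASE_URL --app my-app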