Import data into PostgreSQL and export to CSV
Question
I am trying to import and export CSV data with PostgreSQL (where the database is at a remote host). Normally I would use the psql command to run \copy <table> from <local path> ... and \copy <table> to <local path> ..., but I need to be able to do it from Go, where I don't have shell access or am on systems that don't have psql installed.
The data itself is expected to be pretty light (maybe < 2 MB in total), so I am trying not to implement any structs/schema or track the columns in the tables. When importing into the DB, I want the library/code to infer the schema of the table and push the data to the tables.
Any suggestions on how to implement this? I am not sure whether Go's database/sql, pgx, or pq allows this without specifying columns. Any advice?
Edit:
I ended up using https://github.com/joho/sqltocsv for the DB export, which is simple enough that I don't have to define any schema/structs.
I don't have the code, but I tried gorm and realized I would need to define a struct/schema for it.
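For reference, a minimal sketch of what the sqltocsv export can look like, assuming lib/pq as the database/sql driver and placeholder connection details and table name (not from the original post):

package main

import (
	"database/sql"

	_ "github.com/lib/pq" // registers the "postgres" driver for database/sql
	"github.com/joho/sqltocsv"
)

func main() {
	// Placeholder DSN; replace with your own connection details.
	db, err := sql.Open("postgres", "postgres://my_user:my_password@remote_host:5432/db_name?sslmode=disable")
	if err != nil {
		panic(err)
	}
	defer db.Close()

	// No structs or schema needed: sqltocsv writes whatever columns the query returns.
	rows, err := db.Query("SELECT * FROM table1")
	if err != nil {
		panic(err)
	}
	defer rows.Close()

	// Write the result set to a CSV file, headers included.
	if err := sqltocsv.WriteFile("/dir_to_export_to/table_table1.csv", rows); err != nil {
		panic(err)
	}
}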
Answer 1
Score: 3
I found a way to do it with the pgx package (thanks to @Gustavo Kawamoto's suggestion). Here's my import and export code:
package main

import (
	"fmt"
	"os"

	"github.com/jackc/pgx"
)

func main() {
	pgxConConfig := pgx.ConnConfig{
		Port:     5432,
		Host:     "remote_host",
		Database: "db_name",
		User:     "my_user",
		Password: "my_password",
	}

	conn, err := pgx.Connect(pgxConConfig)
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	tables := []string{"table1", "table2", "table3"}

	importDir := "/dir_to_import_from"
	exportDir := "/dir_to_export_to"

	for _, t := range tables {
		// Import: open the CSV read-only and stream it into the table.
		f, err := os.Open(fmt.Sprintf("%s/table_%s.csv", importDir, t))
		if err != nil {
			fmt.Println("error opening import file:", err)
			return
		}
		err = importer(conn, f, t)
		f.Close() // close only after the COPY has consumed the file
		if err != nil {
			fmt.Println("import error:", err)
			break
		}

		fmt.Println("Done with import, doing export")

		// Export: create/truncate the destination CSV and stream the table into it.
		ef, err := os.OpenFile(fmt.Sprintf("%s/table_%s.csv", exportDir, t), os.O_CREATE|os.O_WRONLY|os.O_TRUNC, 0777)
		if err != nil {
			fmt.Println("error opening export file:", err)
			return
		}
		err = exporter(conn, ef, t)
		ef.Close() // close only after the COPY has written the file
		if err != nil {
			fmt.Println("export error:", err)
			break
		}
	}
}

// importer streams the CSV file into the table via COPY ... FROM STDIN.
func importer(conn *pgx.Conn, f *os.File, table string) error {
	res, err := conn.CopyFromReader(f, fmt.Sprintf("COPY %s FROM STDIN DELIMITER '|' CSV HEADER", table))
	if err != nil {
		return err
	}
	fmt.Println("==> import rows affected:", res.RowsAffected())
	return nil
}

// exporter streams the table into the CSV file via COPY ... TO STDOUT.
func exporter(conn *pgx.Conn, f *os.File, table string) error {
	res, err := conn.CopyToWriter(f, fmt.Sprintf("COPY %s TO STDOUT DELIMITER '|' CSV HEADER", table))
	if err != nil {
		return fmt.Errorf("error exporting file: %+v", err)
	}
	fmt.Println("==> export rows affected:", res.RowsAffected())
	return nil
}
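Note: the code above uses the pgx v3 API (import path github.com/jackc/pgx). On pgx v4/v5 the raw COPY entry points live on the underlying pgconn connection instead; a minimal sketch of the same import/export flow, assuming pgx v5 and placeholder connection details and file paths:

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()

	// Placeholder DSN; replace with your own connection details.
	conn, err := pgx.Connect(ctx, "postgres://my_user:my_password@remote_host:5432/db_name")
	if err != nil {
		panic(err)
	}
	defer conn.Close(ctx)

	// Import: stream a local CSV into the table over the connection.
	in, err := os.Open("/dir_to_import_from/table_table1.csv")
	if err != nil {
		panic(err)
	}
	defer in.Close()

	tag, err := conn.PgConn().CopyFrom(ctx, in, "COPY table1 FROM STDIN DELIMITER '|' CSV HEADER")
	if err != nil {
		panic(err)
	}
	fmt.Println("import rows affected:", tag.RowsAffected())

	// Export: stream the table back out to a local CSV.
	out, err := os.Create("/dir_to_export_to/table_table1.csv")
	if err != nil {
		panic(err)
	}
	defer out.Close()

	tag, err = conn.PgConn().CopyTo(ctx, out, "COPY table1 TO STDOUT DELIMITER '|' CSV HEADER")
	if err != nil {
		panic(err)
	}
	fmt.Println("export rows affected:", tag.RowsAffected())
}

Either way, COPY ... FROM STDIN and COPY ... TO STDOUT stream the data over the client connection, so the CSV files never need to exist on the remote database server.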
Answer 2
Score: 2
Check out this package: https://github.com/chop-dbhi/sql-importer
- Automatic table creation
- Uniqueness and not-null detection
- Support for CSV files wider than 1600 columns (the Postgres limit)