Scanning large rows into structs in Go
Question
I'm working with a database which has yet to be normalized, and this table contains records with over 40 columns.
The following is my Go code for (attempting) to scan the records into a large struct:
type Coderyte struct {
    ID         int    `json:"id"`
    EmrID      int    `json:"emr_id"`
    DftID      int    `json:"dft_id"`
    Fix        int    `json:"fix"`
    ReportDate string `json:"report_date"` // Time?
    Patient    string `json:"patient"`
    ... // etc
}
func ReadCoderyte(res http.ResponseWriter, req *http.Request) {
    rows, err := db.Query("SELECT * FROM coderyte")
    if err != nil {
        http.Error(res, "Error querying database", 500)
    }
    defer rows.Close()

    // Convert rows into a slice of Coderyte structs
    coderytes := make([]*Coderyte, 0)
    for rows.Next() {
        coderyte := new(Coderyte)
        err := rows.Scan(&coderyte) // Expected 42 columns
        if err != nil {
            panic(err)
            http.Error(res, "Error converting coderyte object", 500)
        }
        coderytes = append(coderytes, coderyte)
    }
}
When I call this code, Scan complains that it "expected 42 destination arguments, not 1". My understanding is that I would need to address every single field in this large struct inside the Scan call, i.e. Scan(&coderyte.ID, &coderyte.EmrID, etc).
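Spelled out, that explicit version would look something like the sketch below - only the handful of fields shown in the struct above are real, and the remaining thirty-odd destinations are elided:

// Sketch of the explicit approach: one destination pointer per column,
// in the same order the columns come back from the query.
err := rows.Scan(
    &coderyte.ID,
    &coderyte.EmrID,
    &coderyte.DftID,
    &coderyte.Fix,
    &coderyte.ReportDate,
    &coderyte.Patient,
    // ... and so on, for all 42 columns
)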
My searches have only yielded this other question, where the suggested answer is to use sqlx. I'm trying to avoid using a third-party tool if I don't need it.
My question boils down to: Is there a way to convert a large database record into a struct without specifying every single field?
I should also note that the ultimate goal of this function is to return a JSON array of objects, but I did not include that part of the code because I feel it is not important. If there is a way to bypass Scan and return JSON, that would be an appreciated answer as well.
Answer 1
Score: 2
> the ultimate goal of this function is to return a JSON array of objects
It sounds like you could bypass the struct entirely then, and instead scan into a map[string]interface{} and do it all pretty dynamically.
You could do something like this:
rows, _ := db.Query("SELECT * FROM coderyte")
cols, _ := rows.Columns()

store := []map[string]interface{}{}
for rows.Next() {
    columns := make([]interface{}, len(cols))
    columnPointers := make([]interface{}, len(cols))
    for i := range columns {
        columnPointers[i] = &columns[i]
    }

    if err := rows.Scan(columnPointers...); err != nil {
        return err
    }

    m := make(map[string]interface{})
    for i, colName := range cols {
        val := columnPointers[i].(*interface{})
        m[colName] = *val
    }
    store = append(store, m)
}

js, _ := json.Marshal(store)
fmt.Println(string(js))
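Wired into your original handler, a rough sketch might look like this (assuming the same package-level db variable; the error messages are placeholders):

func ReadCoderyte(res http.ResponseWriter, req *http.Request) {
    rows, err := db.Query("SELECT * FROM coderyte")
    if err != nil {
        http.Error(res, "Error querying database", 500)
        return
    }
    defer rows.Close()

    cols, err := rows.Columns()
    if err != nil {
        http.Error(res, "Error reading columns", 500)
        return
    }

    store := []map[string]interface{}{}
    for rows.Next() {
        // One interface{} slot per column; Scan fills them through the pointers.
        columns := make([]interface{}, len(cols))
        columnPointers := make([]interface{}, len(cols))
        for i := range columns {
            columnPointers[i] = &columns[i]
        }
        if err := rows.Scan(columnPointers...); err != nil {
            http.Error(res, "Error scanning row", 500)
            return
        }
        m := make(map[string]interface{})
        for i, colName := range cols {
            m[colName] = columns[i]
        }
        store = append(store, m)
    }

    res.Header().Set("Content-Type", "application/json")
    json.NewEncoder(res).Encode(store)
}

One caveat: depending on your driver, text columns may come back as []byte, which encoding/json marshals as base64 strings, so you may need to convert those values to string before encoding.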
Now, obviously you could also convert it to a struct, since you could take the JSON and do json.Unmarshal, but given your use case that seems like a pointless extra step.
js, _ := json.Marshal(store)
structs := []Coderyte{}
json.Unmarshal(js, &structs)
All that being said, you should probably just use sqlx - it probably does way cleverer things and does them way more efficiently.
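For comparison, a minimal sqlx sketch might look like the following (assuming github.com/jmoiron/sqlx and that the column names match the json tag names used above - the db tags here are illustrative):

import "github.com/jmoiron/sqlx"

// Fields map to columns via `db` tags (sqlx's default tag); columns in the
// result set with no matching field cause an error unless you use Unsafe().
type Coderyte struct {
    ID    int `db:"id" json:"id"`
    EmrID int `db:"emr_id" json:"emr_id"`
    // ... remaining columns
}

func readCoderytes(db *sqlx.DB) ([]Coderyte, error) {
    coderytes := []Coderyte{}
    // Select runs the query and struct-scans every row into the slice.
    err := db.Select(&coderytes, "SELECT * FROM coderyte")
    return coderytes, err
}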