Golang, SQL and ORM avoidance techniques


bry...@gmail.com

unread,
Jun 2, 2017, 8:55:12 AM6/2/17
to golang-nuts

I've been trying hard (well.. as much as I can considering my lack of in-depth Go knowledge or - to be perfectly honest - lack of in-depth knowledge of anything) to find a suitable Go+SQL technique that would not require a lot of code repetition, not use reflection, and not use ORMs of any sort... Could somebody please tell me if there's anything particularly wrong with the following:

type ScannerFunc func() []interface{}

func (db *DB) ScanSome(stmt string, sf ScannerFunc, params ...interface{}) error {
    rows, err := db.Query(stmt, params...)
    if err != nil {
        return err
    }
    defer rows.Close()
    for rows.Next() {
        err = rows.Scan(sf()...)
        if err != nil {
            return err
        }
    }
    if err = rows.Err(); err != nil {
        return err
    }
    return nil
}

Having the above, I could then implement the following for each of my 'models' (User being an example below). This could easily be 'go generate'-d for each model.


type User struct {
    UserID int64
    Name   string
    Role   int
    // (...)
}

func ScanUsersFunc(users *[]*User) ScannerFunc {
    return ScannerFunc(func() []interface{} {
        u := User{}
        *users = append(*users, &u)
        var r []interface{} = []interface{}{&u.UserID, &u.Name, &u.Role /* (more properties) */}
        return r
    })
}


and finally use it like this: 


const (
    sqlUsersByRole = "SELECT user_id,name,role, (more if needed) FROM user WHERE role=?"
    sqlAllUsers    = "SELECT user_id,name,role FROM user"
)

func (db *DB) UsersByRole(role int) ([]*User, error) {
    users := make([]*User, 0)
    err := db.ScanSome(sqlUsersByRole, ScanUsersFunc(&users), role)
    if err != nil {
        return nil, err
    }
    return users, nil
}

func (db *DB) AllUsers() ([]*User, error) {
    users := make([]*User, 0)
    err := db.ScanSome(sqlAllUsers, ScanUsersFunc(&users))
    if err != nil {
        return nil, err
    }
    return users, nil
}


Alternatively (to avoid scanning/returning all results) a callback could be provided to ScanSome and called after each scan.
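
For illustration, a minimal sketch of such a callback variant; the name ScanEach and the per-row callback signature are assumptions, not anything from the post:

func (db *DB) ScanEach(stmt string, sf ScannerFunc, each func() error, params ...interface{}) error {
    rows, err := db.Query(stmt, params...)
    if err != nil {
        return err
    }
    defer rows.Close()
    for rows.Next() {
        // sf supplies the scan destinations; each is invoked after every
        // successful Scan, so rows can be processed one at a time instead
        // of being accumulated in a slice.
        if err = rows.Scan(sf()...); err != nil {
            return err
        }
        if err = each(); err != nil {
            return err
        }
    }
    return rows.Err()
}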

Obviously I could also implement ScanOne for situations where I only expect one row of results...
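
Such a ScanOne could be a thin wrapper over QueryRow; a minimal sketch, assuming *DB embeds *sql.DB so QueryRow is available:

func (db *DB) ScanOne(stmt string, sf ScannerFunc, params ...interface{}) error {
    // QueryRow defers errors until Scan; sql.ErrNoRows is returned when
    // the query produced no rows.
    return db.QueryRow(stmt, params...).Scan(sf()...)
}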


So - any obvious issues with the above 'technique'...?


Thanks,

adam



silviu...@gmail.com

unread,
Jun 3, 2017, 7:32:56 AM6/3/17
to golang-nuts
Hi Adam,

Back in 2014 I had the same driving motives as you, and I ended up writing my own generator for Postgres (to deal with tables, views, and functions, for the basic types).


It served me well over the past 3-4 years, particularly because I tend to rely a lot on materialized views, which I can refresh via (generated) Go code for complex queries.
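
As an illustration of that last point (not Silviu's actual generated code, and the view name is made up), refreshing a materialized view from Go boils down to a single Exec:

// Hypothetical generated helper for a Postgres materialized view.
// CONCURRENTLY avoids blocking readers, but requires a unique index
// on the view.
func RefreshUserStats(db *sql.DB) error {
    _, err := db.Exec("REFRESH MATERIALIZED VIEW CONCURRENTLY user_stats")
    return err
}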

If you start your own project, I can tell you it's a large undertaking and will probably end up a work in progress over months and years.
When mapping database objects you will find that some of the code may not feel like idiomatic Go, so you will need to use your judgment as to whether it's appropriate for you individually or for a larger team, where the least common denominator is more important than generated patterns.

Cheers,
Silviu

Niko Schwarz

unread,
Jun 4, 2017, 1:58:18 PM6/4/17
to golang-nuts
There's no great trick, but there are some general ideas you should be aware of.

 - The Google way of storing data: Cloud Spanner (This is great to balance out some denormalization.)

Niko

Ren Thraysk

unread,
Jun 5, 2017, 12:43:33 PM6/5/17
to golang-nuts

Wrote something similar recently.

One difference is that I moved the rows.Scan call into the passed-in function.


type Scannable interface {
    Scan(...interface{}) error
}

func scanIntoUser(u *store.User) func(s Scannable) error {
    return func(s Scannable) error {
        return s.Scan(&u.Id, &u.Name, &u.PasswordHash, &u.Email)
    }
}

func (s *userStore) UserByName(name string) (*store.User, error) {
    u := &store.User{}
    err := ExpectOneRow(s.db, scanIntoUser(u), "CALL spUserByName(?)", name)
    if err != nil {
        return nil, err
    }
    return u, nil
}
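
ExpectOneRow isn't shown in the post; a plausible sketch, guessing the signature from the call above and assuming s.db is a *sql.DB:

// *sql.Row satisfies the Scannable interface, so the query can be run
// with QueryRow and handed straight to the supplied scan function.
func ExpectOneRow(db *sql.DB, scan func(Scannable) error, query string, args ...interface{}) error {
    return scan(db.QueryRow(query, args...))
}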


The idea being that it gives an opportunity to do some mapping by scanning into local variables and then assigning them into the object, for instance when dealing with NULLs: scan into a sql.Null* variable, then modify the struct as you see appropriate.
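
For example, a minimal sketch of that kind of mapping, assuming an email column that may be NULL and a plain string Email field on store.User:

func scanIntoUser(u *store.User) func(s Scannable) error {
    return func(s Scannable) error {
        var email sql.NullString // intermediate value for the nullable column
        if err := s.Scan(&u.Id, &u.Name, &u.PasswordHash, &email); err != nil {
            return err
        }
        if email.Valid {
            u.Email = email.String
        }
        return nil
    }
}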

Ren

venturestre...@gmail.com

unread,
Jun 6, 2017, 8:46:58 AM6/6/17
to golang-nuts
On a complete tangent, are you calling a stored procedure there? I thought the database/sql package didn't support MySQL stored procedures yet? 

Ren Thraysk

unread,
Jun 6, 2017, 1:13:30 PM6/6/17
to golang-nuts


On Tuesday, 6 June 2017 13:46:58 UTC+1, venturestre...@gmail.com wrote:
On a complete tangent, are you calling a stored procedure there? I thought the database/sql package didn't support MySQL stored procedures yet? 

Haven't encountered a serious issue as yet. Obviously output parameters, and probably return values, aren't supported.

Selects appear to work fine. With simple single-row insert statements, sql.Result's LastInsertId() only returns 0, though its RowsAffected() does return 1.
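
A sketch of the kind of call being described (the stored procedure name is made up, and imports for database/sql and log are elided):

func insertUser(db *sql.DB, name, email string) error {
    res, err := db.Exec("CALL spInsertUser(?, ?)", name, email)
    if err != nil {
        return err
    }
    // Per the observation above, with a single-row insert inside a MySQL
    // stored procedure LastInsertId comes back 0 while RowsAffected is 1.
    id, _ := res.LastInsertId()
    n, _ := res.RowsAffected()
    log.Printf("insert id=%d, rows affected=%d", id, n)
    return nil
}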

Ren

Tieson Molly

unread,
Jun 7, 2017, 2:52:49 PM6/7/17
to golang-nuts
I had the same line of thinking regarding avoiding an ORM. I settled on using sqlx for my project.
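
For reference, a minimal sqlx sketch of the UsersByRole query from earlier in the thread (the db struct tags and function shape are assumptions, not from the post):

import "github.com/jmoiron/sqlx"

type User struct {
    UserID int64  `db:"user_id"`
    Name   string `db:"name"`
    Role   int    `db:"role"`
}

// sqlx maps columns to fields via the db struct tags, so the per-model
// ScannerFunc boilerplate disappears, at the cost of reflection.
func UsersByRole(db *sqlx.DB, role int) ([]*User, error) {
    users := []*User{}
    err := db.Select(&users, "SELECT user_id, name, role FROM user WHERE role=?", role)
    return users, err
}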