
I have a list of structs as follows:

row = [
   {
      "name":<name1>,
      "age" :<age1>,
      "job" :<job1>
   },
   {
      "name":<name2>,
      "age" :<age2>,
      "job" :<job2>
   },
   {
      "name":<name3>,
      "age" :<age3>,
      "job" :<job3>
   },
   etc...
]

I want to insert this into an SQL table. So far I have been looping over the array and inserting each row one by one. Is there a way to insert all the rows with just one query? I know about bulk insert, but my understanding is that for a bulk insert I would have to import the data from an external file, which I don't want to do. How can I use the data from this array to perform a bulk insert?
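For reference, this is roughly the loop I have now (a sketch only; the Person type, table name, and MySQL-style ? placeholders are illustrative):

type Person struct {
    Name string
    Age  int
    Job  string
}

// Current approach: one INSERT per element, so N round trips for N rows.
// db is a *sql.DB from database/sql.
func insertOneByOne(db *sql.DB, people []Person) error {
    for _, p := range people {
        _, err := db.Exec(
            "INSERT INTO person_table (name, age, job) VALUES (?, ?, ?)",
            p.Name, p.Age, p.Job,
        )
        if err != nil {
            return err
        }
    }
    return nil
}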


2 Answers


I don't think you're going to be able to do any kind of super-optimized bulk insert without placing a file on the server.

I am not sure whether the standard database/sql package supports it, but with the sqlx extension you can build a single insert statement with named bindvars that map to struct fields, and then pass a slice of those structs to a method like NamedExec.

Something like this:

// Assumes db is a *sqlx.DB (github.com/jmoiron/sqlx).
type User struct {
    Name  string `db:"name"`
    Email string `db:"email"`
}

users := []User{
    {
        Name:  "alex",
        Email: "[email protected]",
    },
    {
        Name:  "muhammed",
        Email: "[email protected]",
    },
}

// NamedExec expands the slice into a single multi-row INSERT,
// binding :name and :email to the tagged struct fields.
_, err := db.NamedExec(`INSERT INTO users (name, email) VALUES (:name, :email)`, users)

import "database/sql"

type Person struct {
    Name string
    Age  int
    Job  string
}

// InsertPeople builds one multi-row INSERT statement and executes it in a
// single call. The ? placeholders suit MySQL/SQLite drivers; PostgreSQL
// drivers expect $1-style placeholders instead.
func InsertPeople(db *sql.DB, personSlice []*Person) error {
    if len(personSlice) == 0 {
        return nil // nothing to insert
    }

    queryString := `INSERT INTO person_table (name, age, job) VALUES `

    numOfFields := 3
    params := make([]interface{}, len(personSlice)*numOfFields)
    for i, p := range personSlice {
        pos := i * numOfFields
        params[pos+0] = p.Name
        params[pos+1] = p.Age
        params[pos+2] = p.Job

        queryString += `(?, ?, ?),`
    }

    queryString = queryString[:len(queryString)-1] // drop the trailing comma

    _, err := db.Exec(queryString, params...)
    return err
}
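Called with your data, this runs as a single statement. A hypothetical call site (the names and values are just examples; db is a *sql.DB opened with your MySQL driver):

people := []*Person{
    {Name: "alice", Age: 30, Job: "engineer"},
    {Name: "bob", Age: 25, Job: "designer"},
}

// Executes: INSERT INTO person_table (name, age, job) VALUES (?, ?, ?), (?, ?, ?)
if err := InsertPeople(db, people); err != nil {
    log.Fatal(err)
}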

3 Comments

Thanks. But this is more of a workaround, using Go to construct an SQL query that inserts multiple rows, right? I am looking for a more SQL-ish way, possibly something like BULK INSERT but without reading from a locally stored file.
Is it possible to create a .txt/.csv-like object from the array and use it with a normal SQL BULK INSERT statement, i.e. without storing it locally or reading it from a local file?
@MuhammedRoshan There isn't out-of-the-box support for bulk inserts in the standard library. You'll have to consult the documentation of the driver you are using to learn whether it supports what you are looking for. For example, the lib/pq driver for the PostgreSQL database has something like this: pkg.go.dev/github.com/lib/pq#hdr-Bulk_imports. But you are apparently using MySQL, and which driver you're using, no one knows but you.
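For reference, the lib/pq bulk-import pattern linked above looks roughly like this (PostgreSQL only; a sketch reusing the Person type from the answer and a hypothetical person_table):

txn, err := db.Begin()
if err != nil {
    return err
}

// pq.CopyIn builds a COPY person_table (name, age, job) FROM STDIN statement.
stmt, err := txn.Prepare(pq.CopyIn("person_table", "name", "age", "job"))
if err != nil {
    return err
}

for _, p := range personSlice {
    if _, err := stmt.Exec(p.Name, p.Age, p.Job); err != nil {
        return err
    }
}

// An Exec with no arguments flushes the buffered rows.
if _, err := stmt.Exec(); err != nil {
    return err
}
if err := stmt.Close(); err != nil {
    return err
}
return txn.Commit()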
