
I'm trying to run this command for many tables (one table at a time) in Go 1.9:

COPY (select row_to_json(foo) FROM (SELECT * FROM bar) foo ) TO '/tmp/bar.json';

Is this even possible? It seems with lib/pq, it is not. With go-pg, I keep running out of memory because it buffers everything into memory first.

Doing this from the command prompt works fine. I'd rather use Go's Postgres libraries than shell out to the command prompt.

In short, I'm trying to dump entire tables into JSON in their own files.

Has anyone done this successfully?

Thank you!

EDIT:

Since lib/pq didn't support this at all, I'm using go-pg. Here is the code:

    var buf bytes.Buffer
    _, err := db.CopyTo(&buf, "COPY (SELECT row_to_json(foo) FROM (SELECT * FROM bar) foo) TO '/tmp/bar.json'")
    if err != nil {
        panic(err)
    }
  • It seems that COPY TO isn't supported yet in lib/pq and jackc/pgx. How did you execute COPY TO in go-pg (query and code snippet) that caused the out-of-memory error? Looking at the source code of go-pg, it reads and then writes the data in chunks (it does not buffer it all into memory first). Commented Oct 10, 2017 at 22:20
  • Have you tried: \copy (SELECT json_agg(foo) FROM foo) TO '/tmp/foo.json'? Postgres >9.2 required. Commented Oct 10, 2017 at 22:43
  • Sorry for my English, but why don't you just run select row_to_json(foo) FROM (SELECT * FROM bar) foo and write the result into a file? Commented Oct 11, 2017 at 0:24
  • Updated my question with the code. Commented Oct 11, 2017 at 13:23
  • In response to nk2ge5k: because that would use up too much RAM. Commented Oct 11, 2017 at 15:17

1 Answer


There are two approaches to exporting tables into JSON files using COPY.

  1. Using go-pg. To avoid the out-of-memory issue, write the result directly to a file instead of buffering it in memory. The snippet:

    //open database connection first
    db := pg.Connect(&pg.Options{
        User:     "username",
        Password: "password",
        Database: "database",
        Addr:     "192.168.1.2:5432",   //database address
    })
    defer db.Close()
    
    //open output file
    out, err := os.Create("/tmp/bar.json")
    if err != nil {
        panic(err)
    }
    defer out.Close()
    
    //execute copy
    copy := `COPY (SELECT row_to_json(foo) FROM (SELECT * FROM bar) foo ) TO STDOUT`
    _, err = db.CopyTo(out, copy)
    if err != nil {
        panic(err)
    }
    
  2. Using psql and the os/exec package. Basically, this approach executes a psql command of the form psql -c query1 -c query2 ... args. Please note that this approach requires psql to be installed. The snippet:

    queries := []string{
        `SET client_encoding='UTF8'`,
        `\COPY (SELECT row_to_json(foo) FROM (SELECT * FROM bar) foo ) TO '/tmp/bar.json'`,
    }
    dsn := "postgresql://username:[email protected]/database"
    
    //construct arguments
    args := []string{}
    for _, q := range queries {
        args = append(args, "-c", q)
    }
    args = append(args, dsn)
    
    //Execute psql command
    cmd := exec.Command("psql", args...)
    stdoutStderr, err := cmd.CombinedOutput()
    if err != nil {
        panic(err)
    }
    fmt.Printf("%s\n", stdoutStderr)
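The second approach generalizes to many tables in the same way: one \COPY per table, each with its own output file. Below is a minimal sketch of just the argument construction, assuming hypothetical table names bar and baz and output files under /tmp; the commented line shows where the exec.Command call from above would go:

```go
package main

import "fmt"

// copyMeta builds psql's client-side \COPY meta-command for one table,
// so the output file is written on the machine running this program.
func copyMeta(table, path string) string {
	return fmt.Sprintf(
		`\COPY (SELECT row_to_json(t) FROM (SELECT * FROM %s) t) TO '%s'`,
		table, path)
}

func main() {
	tables := []string{"bar", "baz"} // hypothetical table names
	args := []string{"-c", `SET client_encoding='UTF8'`}
	for _, tbl := range tables {
		args = append(args, "-c", copyMeta(tbl, "/tmp/"+tbl+".json"))
	}
	args = append(args, "postgresql://username:[email protected]/database")
	fmt.Println(len(args), "psql arguments built")
	// cmd := exec.Command("psql", args...) // then run as shown above
}
```

Because \COPY (unlike plain COPY ... TO 'file') runs client-side, this works even when the Go program and the database are on different machines.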
    

Note

Adjust the connection parameters (username, password, host/IP address, etc.) as needed. For details, please refer to the documentation.
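To cover the "many tables, one table at a time" part of the question, approach 1 can be wrapped in a loop. The sketch below only builds the per-table COPY statement and output path (the table names and the /tmp directory are assumptions); the commented lines show where the os.Create and go-pg db.CopyTo calls from approach 1 would go:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// copyQuery builds the COPY ... TO STDOUT statement for one table.
// COPY cannot take bind parameters for identifiers, so the table name
// must come from a trusted, hard-coded list -- never from user input.
func copyQuery(table string) string {
	return fmt.Sprintf(
		`COPY (SELECT row_to_json(t) FROM (SELECT * FROM %s) t) TO STDOUT`,
		table)
}

// outPath returns the per-table output file, e.g. /tmp/bar.json.
func outPath(dir, table string) string {
	return filepath.Join(dir, table+".json")
}

func main() {
	tables := []string{"bar", "baz"} // hypothetical table names
	for _, tbl := range tables {
		fmt.Printf("%s -> %s\n", copyQuery(tbl), outPath("/tmp", tbl))
		// out, err := os.Create(outPath("/tmp", tbl)) // as in approach 1
		// _, err = db.CopyTo(out, copyQuery(tbl))     // go-pg streams into the file
	}
}
```

Since db.CopyTo streams into whatever io.Writer it is given, each table's data goes straight to its own file and memory use stays flat regardless of table size.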
