Querying in Chunks for Big Queries
It's very unusual for us to need to load a number of records from the database that is too big to fit in memory, e.g. loading all the users and sending them somewhere. But it might happen.

For these cases, it's best to load chunks of data at a time so that we can work on a substantial amount of data without ever overloading our memory capacity. For this use-case we have a specific function called QueryChunks:
err = db.QueryChunks(ctx, ksql.ChunkParser{
	Query:     "SELECT * FROM users WHERE type = ?",
	Params:    []interface{}{usersType},
	ChunkSize: 100,
	ForEachChunk: func(users []User) error {
		err := sendUsersSomewhere(users)
		if err != nil {
			// This will abort the QueryChunks loop and return this error
			return err
		}
		return nil
	},
})
if err != nil {
	panic(err.Error())
}
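If you need to stop the loop early without reporting a failure, ksql also provides a sentinel error for that; the sketch below assumes it is named ksql.ErrAbortIteration (check the version you are using) and that returning it from ForEachChunk makes QueryChunks stop and return nil instead of an error. The weHaveSeenEnough helper is hypothetical and only illustrates the early-exit condition:

err = db.QueryChunks(ctx, ksql.ChunkParser{
	Query:     "SELECT * FROM users WHERE type = ?",
	Params:    []interface{}{usersType},
	ChunkSize: 100,
	ForEachChunk: func(users []User) error {
		if err := sendUsersSomewhere(users); err != nil {
			return err
		}
		if weHaveSeenEnough(users) {
			// Assumption: ErrAbortIteration aborts the loop and
			// QueryChunks returns nil instead of this error.
			return ksql.ErrAbortIteration
		}
		return nil
	},
})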
Its signature is more complicated than those of the other two Query* methods, so it is advisable to prefer those two whenever possible, reserving this one for the rare use-cases where you would otherwise be loading big sections of the database into memory.
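For comparison, when the result set is known to fit comfortably in memory, the plain Query method keeps the code shorter; this is a minimal sketch assuming the same db, ctx, User struct and usersType variable as above:

// Loads the whole result set into memory at once,
// which is fine for reasonably sized queries.
var users []User
err = db.Query(ctx, &users, "SELECT * FROM users WHERE type = ?", usersType)
if err != nil {
	panic(err.Error())
}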