I suspect that under certain conditions the Upsert method creates duplicate records.
I have tried to reproduce the issue, but I haven't managed yet - I will keep trying and update here with my findings.
What I think is happening: when badgerdb compacts or compresses its data, the Upserts end up creating duplicate records. A simple badgerhold.Delete(key, Record) removes all the duplicates.
I have a multi-goroutine app that runs as a long-standing process and keeps the badger DB connection open. For a while now I have noticed that I sometimes get the same record twice, with only one field modified.
I have observed this behavior over a couple of months while developing the app, but I couldn't build a reproducer for it.
(I have tried Upserting 1000 records from multiple nested goroutines 1000 times, but there were no duplicates in the end.) In my project I ended up just calling Delete(key, Record) before every Upsert. My keys always have custom types of the form type MyType uuid.UUID.
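The Delete-before-Upsert workaround can be sketched as below. The in-memory store is a hypothetical stand-in used only to make the snippet self-contained - it mirrors the shape of badgerhold's Delete/Upsert calls, it is not badgerhold itself:

```go
package main

import "fmt"

// Store is a toy stand-in for a badgerhold-like store.
// It deliberately allows duplicates, to model the suspected bug.
type Store struct {
	records map[string][]string // key -> stored values
}

func NewStore() *Store { return &Store{records: map[string][]string{}} }

// Delete removes every record stored under key.
func (s *Store) Delete(key string) { delete(s.records, key) }

// Upsert appends blindly, so repeated calls could accumulate duplicates.
func (s *Store) Upsert(key, value string) {
	s.records[key] = append(s.records[key], value)
}

// upsertClean is the workaround from the issue: delete first, then
// upsert, so at most one record ever exists for the key.
func upsertClean(s *Store, key, value string) {
	s.Delete(key)
	s.Upsert(key, value)
}

func main() {
	s := NewStore()
	for i := 0; i < 3; i++ {
		upsertClean(s, "root-1", fmt.Sprintf("v%d", i))
	}
	fmt.Println(len(s.records["root-1"])) // 1: only the latest value remains
}
```

This trades one extra delete per write for the guarantee that no stale copies survive, which matches the mitigation described above.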
An example of the struct I upsert:
```go
import (
	"fmt"

	"github.com/google/uuid"
)

type RootID uuid.UUID

// Compare only distinguishes equal from not-equal:
// non-equal values always report -1.
func (id RootID) Compare(v interface{}) (int, error) {
	vID, ok := v.(RootID)
	if !ok {
		return 0, fmt.Errorf("invalid interface")
	}
	if id == vID {
		return 0, nil
	}
	return -1, nil
}

type Root struct {
	ID       RootID `badgerhold:"key"`
	Username string
	// Other fields without badgerhold annotations, all of primitive types.
}
```
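For reference, the Compare method above treats every non-equal value as "less". A minimal self-contained sketch shows this - the local [16]byte alias below is an assumption standing in for uuid.UUID (whose underlying type is also [16]byte), so the snippet runs without the uuid dependency:

```go
package main

import (
	"errors"
	"fmt"
)

// UUID is a stand-in for uuid.UUID; both have underlying type [16]byte.
type UUID [16]byte

type RootID UUID

// Compare mirrors the method above: 0 for equal, -1 otherwise.
func (id RootID) Compare(v interface{}) (int, error) {
	vID, ok := v.(RootID)
	if !ok {
		return 0, errors.New("invalid interface")
	}
	if id == vID {
		return 0, nil
	}
	return -1, nil
}

func main() {
	a := RootID{1}
	b := RootID{2}
	r1, _ := a.Compare(a) // equal values
	r2, _ := a.Compare(b) // unequal, one direction
	r3, _ := b.Compare(a) // unequal, the other direction
	fmt.Println(r1, r2, r3) // 0 -1 -1
}
```

Note that both a.Compare(b) and b.Compare(a) return -1, so this comparer does not define a consistent total order; whether that matters for badgerhold's key handling is an open question, but it may be worth checking.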
I think I need to let my reproducer program run for a while, or find a way to trigger a compaction/compression in badgerdb.
I know badger has some "less consistent" options that can be passed in, so I'd be careful using any of those.
Also, Badger runs in a "snapshot" isolation mode, where readers see the previously committed value during open transactions - so if you are doing some sort of check for duplicates before inserting new ones, you can absolutely generate duplicates.
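That failure mode can be illustrated without Badger at all. The toy snapshot-isolated store below is an assumption made purely for illustration (not Badger's implementation): two transactions each check a snapshot taken at begin time, neither sees the other's pending insert, and both inserts land:

```go
package main

import "fmt"

// Txn reads from a snapshot taken at Begin time and buffers its writes,
// loosely modelling snapshot isolation. This is a toy, not Badger.
type Txn struct {
	snapshot map[string][]int // state as of Begin
	writes   map[string]int   // buffered writes
}

type DB struct {
	data map[string][]int
}

func (db *DB) Begin() *Txn {
	snap := make(map[string][]int, len(db.data))
	for k, v := range db.data {
		snap[k] = v
	}
	return &Txn{snapshot: snap, writes: map[string]int{}}
}

// Exists consults only the snapshot, so it cannot see
// writes committed after this transaction began.
func (t *Txn) Exists(key string) bool {
	_, ok := t.snapshot[key]
	return ok
}

func (t *Txn) Insert(key string, v int) { t.writes[key] = v }

// Commit appends buffered writes; two check-then-insert transactions
// that overlapped therefore both succeed, creating a duplicate.
func (db *DB) Commit(t *Txn) {
	for k, v := range t.writes {
		db.data[k] = append(db.data[k], v)
	}
}

func main() {
	db := &DB{data: map[string][]int{}}

	// Both transactions begin before either commits.
	t1, t2 := db.Begin(), db.Begin()

	// Each checks its own snapshot: neither sees the key.
	if !t1.Exists("user:42") {
		t1.Insert("user:42", 1)
	}
	if !t2.Exists("user:42") {
		t2.Insert("user:42", 2)
	}

	db.Commit(t1)
	db.Commit(t2)

	fmt.Println(len(db.data["user:42"])) // 2: both inserts landed
}
```

Real Badger can detect such read-write conflicts at commit time, but only if the transaction's reads are registered inside the transaction; a check done outside the transaction, or against stale data, gives the race shown here.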