We couldn't find a good ODM for MongoDB written in Go, so we made one. Bongo is a wrapper for mgo (https://github.com/go-mgo/mgo) that adds ODM, hooks, validation, and cascade support to its raw Mongo functions.
Bongo is tested using the fantastic GoConvey (https://github.com/smartystreets/goconvey).
Since we're not yet at a major release, some things in the API might change. Here's a list:
- Save - stable
- Find/FindOne/FindById - stable
- Delete - stable
- Save/Delete/Find/Validation hooks - stable
- Cascade - unstable (might need a refactor)
- Change Tracking - stable
- Validation methods - stable
go get github.com/go-bongo/bongo
import "github.com/go-bongo/bongo"
And install dependencies:
cd $GOPATH/src/github.com/go-bongo/bongo && go get .
Create a new bongo.Config instance:
config := &bongo.Config{
    ConnectionString: "localhost",
    Database:         "bongotest",
}
Then just create a new instance of bongo.Connection, and make sure to handle any connection errors:
connection, err := bongo.Connect(config)

if err != nil {
    log.Fatal(err)
}
If you need to, you can access the raw mgo session with connection.Session.
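For example, a quick sketch of using the raw session directly (here just pinging the server):

// Ping MongoDB through the underlying mgo session exposed by bongo
if err := connection.Session.Ping(); err != nil {
    log.Fatal(err)
}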
Any struct can be used as a document as long as it satisfies the Document interface (SetId(bson.ObjectId), GetId() bson.ObjectId). We recommend that you use the DocumentBase provided with Bongo, which implements that interface as well as the NewTracker, TimeCreatedTracker and TimeModifiedTracker interfaces (to keep track of new/existing documents and created/modified timestamps). If you use the DocumentBase or something similar, make sure you use bson:",inline", otherwise you will get nested behavior when the data goes to your database.
For example:
type Person struct {
    bongo.DocumentBase `bson:",inline"`
    FirstName          string
    LastName           string
    Gender             string
}
You can use child structs as well.
type Person struct {
    bongo.DocumentBase `bson:",inline"`
    FirstName          string
    LastName           string
    Gender             string
    HomeAddress        struct {
        Street string
        Suite  string
        City   string
        State  string
        Zip    string
    }
}
You can add special methods to your document type that will automatically get called by bongo during certain actions. Hooks get passed the current *bongo.Collection so you can avoid having to couple them with your actual database layer. Currently available hooks are:
- func (s *ModelStruct) Validate(*bongo.Collection) []error (returns a slice of errors - if it is empty, validation is assumed to have succeeded)
- func (s *ModelStruct) BeforeSave(*bongo.Collection) error
- func (s *ModelStruct) AfterSave(*bongo.Collection) error
- func (s *ModelStruct) BeforeDelete(*bongo.Collection) error
- func (s *ModelStruct) AfterDelete(*bongo.Collection) error
- func (s *ModelStruct) AfterFind(*bongo.Collection) error
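For example, hooks on the Person type defined above might look like this (a minimal sketch; the required-field check and gender normalization are just illustrative assumptions, and the errors and strings packages are assumed to be imported):

// Validate returns a slice of errors; a non-empty slice causes Save to fail
// with a *bongo.ValidationError.
func (p *Person) Validate(c *bongo.Collection) []error {
    errs := []error{}
    if p.FirstName == "" {
        errs = append(errs, errors.New("FirstName is required"))
    }
    return errs
}

// BeforeSave runs before the document is written - a handy place to normalize data.
func (p *Person) BeforeSave(c *bongo.Collection) error {
    p.Gender = strings.ToLower(p.Gender)
    return nil
}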
Just call Save on a collection instance.
myPerson := &Person{
    FirstName: "Testy",
    LastName:  "McGee",
    Gender:    "male",
}

err := connection.Collection("people").Save(myPerson)
Now you'll have a new document in the people collection. If there is an error, you can check whether it is a validation error using a type assertion:
if vErr, ok := err.(*bongo.ValidationError); ok {
    fmt.Println("Validation errors are:", vErr.Errors)
} else {
    fmt.Println("Got a real error:", err.Error())
}
There are three ways to delete a document.
Same thing as Save - just call DeleteDocument on the collection and pass the document instance.
err := connection.Collection("people").DeleteDocument(person)
This will run the BeforeDelete and AfterDelete hooks, if applicable.
This just delegates to mgo.Collection.Remove. It will not run the BeforeDelete and AfterDelete hooks.
err := connection.Collection("people").DeleteOne(bson.M{"FirstName":"Testy"})
This delegates to mgo.Collection.RemoveAll. It will not run the BeforeDelete and AfterDelete hooks.
changeInfo, err := connection.Collection("people").Delete(bson.M{"FirstName":"Testy"})
fmt.Printf("Deleted %d documents", changeInfo.Removed)
person := &Person{}
err := connection.Collection("people").FindById(bson.ObjectIdHex(StringId), person)
The error returned can be a DocumentNotFoundError or a more low-level MongoDB error. To check, use a type assertion:
if _, ok := err.(*bongo.DocumentNotFoundError); ok {
    fmt.Println("document not found")
} else {
    fmt.Println("real error " + err.Error())
}
Finds will return an instance of ResultSet, which you can then optionally Paginate and iterate through to get all results.
// *bongo.ResultSet
results := connection.Collection("people").Find(bson.M{"firstName": "Bob"})

person := &Person{}
count := 0
for results.Next(person) {
    count++
    fmt.Println(person.FirstName)
}
fmt.Println("Found", count, "people")
To paginate, you can run Paginate(perPage int, currentPage int) on the result of connection.Find(). That will return an instance of bongo.PaginationInfo, with properties like TotalRecords, RecordsOnPage, etc.
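For example, a rough sketch of paginating a result set (it is assumed here that Paginate returns a *bongo.PaginationInfo and an error):

results := connection.Collection("people").Find(bson.M{"gender": "male"})

// 25 records per page, fetch page 1
info, err := results.Paginate(25, 1)
if err != nil {
    log.Fatal(err)
}
fmt.Println("Total records:", info.TotalRecords)
fmt.Println("Records on this page:", info.RecordsOnPage)

person := &Person{}
for results.Next(person) {
    fmt.Println(person.FirstName)
}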
To use additional functions like sort, skip, limit, etc., you can access the underlying mgo Query via ResultSet.Query.
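For example (a sketch assuming the underlying query should be adjusted before the first call to Next):

results := connection.Collection("people").Find(bson.M{"gender": "male"})

// Tweak the underlying mgo query - Sort and Limit modify it in place
results.Query.Sort("lastName").Limit(10)

person := &Person{}
for results.Next(person) {
    fmt.Println(person.LastName, person.FirstName)
}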
Same as Find, but it populates the struct you pass as the second argument.
person := &Person{}

err := connection.Collection("people").FindOne(bson.M{"firstName": "Bob"}, person)

if err != nil {
    fmt.Println(err.Error())
} else {
    fmt.Println("Found user:", person.FirstName)
}
If your model struct implements the Trackable interface, it will automatically track changes to your model so you can compare the current values with the original. For example:
type MyModel struct {
    bongo.DocumentBase `bson:",inline"`
    StringVal          string

    diffTracker *bongo.DiffTracker
}

// Easy way to lazy load a diff tracker
func (m *MyModel) GetDiffTracker() *bongo.DiffTracker {
    if m.diffTracker == nil {
        m.diffTracker = bongo.NewDiffTracker(m)
    }
    return m.diffTracker
}
Use as follows:

myModel := &MyModel{}
// Store the current state for comparison
myModel.GetDiffTracker().Reset()
// Change a property...
myModel.StringVal = "foo"
// We know it's been instantiated so no need to use GetDiffTracker()
fmt.Println(myModel.diffTracker.Modified("StringVal")) // true
myModel.diffTracker.Reset()
fmt.Println(myModel.diffTracker.Modified("StringVal")) // false
You can also get the full list of modified fields:

// Change the property again since the last Reset()...
myModel.StringVal = "bar"

isNew, modifiedFields := myModel.GetModified()
fmt.Println(isNew, modifiedFields) // false, ["StringVal"]

myModel.diffTracker.Reset()
isNew, modifiedFields = myModel.GetModified()
fmt.Println(isNew, modifiedFields) // false, []
If you are going to be checking more than one field, you should instantiate a new DiffTrackingSession with diffTracker.NewSession(useBsonTags bool). This will load the changed fields into the session. Otherwise, with each call to diffTracker.Modified(), it will have to recalculate the changed fields.
Bongo supports cascading portions of documents to related documents and the subsequent cleanup upon deletion. For example, if you have a Team collection, and each team has an array of Players, you can cascade a player's first name and last name to his or her team.Players array on save, and remove that element from the array if you delete the player.
To use this feature, your struct needs to have an exported method called GetCascade, which returns an array of *bongo.CascadeConfig. Additionally, if you want to make use of the OldQuery property to remove references from previously related documents, you should probably also implement the DiffTracker on your model struct (see above).
You can also leave ThroughProp blank, in which case the properties of the document will be cascaded directly onto the related document. This is useful when you want to cascade ObjectId properties or other references, but it is important that you keep in mind that these properties will be nullified on the related document when the main doc is deleted or changes references.
Also note that, like the above hooks, the GetCascade method will be passed the instance of the bongo.Collection so you can keep your models decoupled from your database layer.
type CascadeConfig struct {
    // The collection to cascade to
    Collection *mgo.Collection

    // The relation type (does the target doc have an array of these docs [REL_MANY] or just reference a single doc [REL_ONE])
    RelType int

    // The property on the related doc to populate
    ThroughProp string

    // The query to find related docs
    Query bson.M

    // The data that constructs the query may have changed - this is to remove self from previous relations
    OldQuery bson.M

    // Properties that will be cascaded/deleted. Can (should) be in dot notation for nested properties.
    // This is used to nullify properties when there is an OldQuery or if the document is deleted.
    Properties []string

    // The actual data that will be cascaded
    Data interface{}
}
type ChildRef struct {
    Id   bson.ObjectId `bson:"_id" json:"_id"`
    Name string
}
func (c *Child) GetCascade(collection *bongo.Collection) []*bongo.CascadeConfig {
    connection := collection.Connection

    rel := &ChildRef{
        Id:   c.Id,
        Name: c.Name,
    }

    cascadeSingle := &bongo.CascadeConfig{
        Collection:  connection.Collection("parents").Collection(),
        Properties:  []string{"name"},
        Data:        rel,
        ThroughProp: "child",
        RelType:     bongo.REL_ONE,
        Query: bson.M{
            "_id": c.ParentId,
        },
    }

    cascadeMulti := &bongo.CascadeConfig{
        Collection:  connection.Collection("parents").Collection(),
        Properties:  []string{"name"},
        Data:        rel,
        ThroughProp: "children",
        RelType:     bongo.REL_MANY,
        Query: bson.M{
            "_id": c.ParentId,
        },
    }

    if c.DiffTracker.Modified("ParentId") {
        origId, _ := c.DiffTracker.GetOriginalValue("ParentId")
        if origId != nil {
            oldQuery := bson.M{
                "_id": origId,
            }
            cascadeSingle.OldQuery = oldQuery
            cascadeMulti.OldQuery = oldQuery
        }
    }

    return []*bongo.CascadeConfig{cascadeSingle, cascadeMulti}
}
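For context, a hypothetical parent document receiving these cascades might look like the sketch below. This struct is not part of Bongo; the field names and bson tags are assumptions chosen to match the ThroughProp values above.

// Hypothetical parent document that receives the cascaded ChildRef data
type Parent struct {
    bongo.DocumentBase `bson:",inline"`
    Name               string
    Child              ChildRef   `bson:"child"`    // filled by cascadeSingle (REL_ONE)
    Children           []ChildRef `bson:"children"` // maintained by cascadeMulti (REL_MANY)
}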
This does the following:
- When you save a child, it will populate its parent's (defined by cascadeSingle.Query) child property with an object consisting of one key/value pair (name).
- When you save a child, it will also modify its parent's (defined by cascadeMulti.Query) children array, either modifying or pushing to the array of key/value pairs, also with just name.
- When you delete a child, it will use cascadeSingle.OldQuery to remove the reference from its previous parent.child.
- When you delete a child, it will also use cascadeMulti.OldQuery to remove the reference from its previous parent.children.
Note that the ThroughProp must be the actual field name in the database (bson tag), not the property name on the struct. If there is no ThroughProp, the data will be cascaded directly onto the root of the document.