DotDB is a functional, convergent, bi-directional, streamable, distributed database.
This is not quite stable yet.
There is no npm support. Please use the repo link directly:
yarn add esm
yarn add https://github.com/dotchain/dotjs
A single-file output is available via dist/dotdb.js.
DotDB is a functional, convergent, reactive, synchronized store.
Values in DotDB are effectively immutable:
import {expect} from "chai";
import {Text} from "dotjs/db";
it("should not mutate underlying value", ()=> {
const initial = new Text("hello");
const updated = initial.splice(5, 0, " world");
expect(initial.text).to.equal("hello");
expect(updated.text).to.equal("hello world");
});
Values in DotDB converge when mutated by multiple writers. The convergence honors the immutable feel by leaving the original value intact and instead making the converged value available via the latest() method.
import {expect} from "chai";
import {Stream, Text} from "dotjs/db";
describe("Convergence", () => {
it("should converge", ()=> {
const initial = new Text("hello").setStream(new Stream());
const updated1 = initial.splice(5, 0, " world");
const updated2 = initial.splice(5, 0, ", goodbye!");
expect(initial.text).to.equal("hello");
expect(updated1.text).to.equal("hello world");
expect(updated2.text).to.equal("hello, goodbye!");
expect(initial.latest().text).to.equal("hello world, goodbye!");
expect(updated1.latest().text).to.equal("hello world, goodbye!");
expect(updated2.latest().text).to.equal("hello world, goodbye!");
});
});
Note: Convergence requires a *stream* associated with the value. In the example, the initial value is set up with a new stream. In practice this is rarely needed, as all derived values simply inherit the *stream* from their parents and the root object is created at app initialization. But the examples here are all standalone, so they set up the stream explicitly as part of construction.
Values in DotDB are reactive. The example below illustrates fetching a key from a dictionary; the fetched value keeps up with any changes to the dictionary:
import {expect} from "chai";
import {Dict, Stream, Text} from "dotjs/db";
describe("Reactive", () => {
it("fields update with underlying changes", ()=> {
const initial = new Dict({
hello: new Text("world")
}).setStream(new Stream());
const hello = initial.get("hello");
initial.get("hello").replace(new Text("goodbye!"));
expect(hello.latest().text).to.equal("goodbye!");
expect(initial.latest().get("hello").text).to.equal("goodbye!");
});
});
A slightly more involved example uses the field() function, which works on any value. For an actual Dict (or similar), it returns the same value as get(); for everything else it returns Null. The reactive part is that the result keeps up with changes to the underlying object:
it("fields update even if the underlying object changes", ()=> {
const initial = new Text("hello").setStream(new Stream());
const hello = field(new Store(), initial, new Text("hello"));
expect(hello).to.be.instanceOf(Null);
initial.replace(new Dict({hello: new Text("world")}));
expect(hello.latest().text).to.equal("world")
});
A slightly different reactive behavior occurs when the field key itself changes:
it("fields update when the key changes", ()=> {
const initial = new Dict({
hello: new Text("world"),
boo: new Text("hoo")
}).setStream(new Stream());
const key = new Text("hello").setStream(new Stream());
const hello = field(new Store(), initial, key);
expect(hello.text).to.equal("world");
key.replace(new Text("boo"));
expect(hello.latest().text).to.equal("hoo");
});
Standard "Functional Reactive" formulations (such as with ReactiveX/rxjs, Sodium) tend to be about data flow in one direction. DotDB takes a fairly different route attempting to make data flow bi-directionally: most derivations are meant to be meaningfully modified with the mutations being proxied to the underlying values quite transparently:
import {expect} from "chai";
import {field, Dict, Null, Store, Stream, Text} from "dotjs/db";
describe("Bi-directional", () => {
it("proxies edits on fields", () => {
const initial = new Dict({
hello: new Text("world")
}).setStream(new Stream());
const hello = field(new Store(), initial, new Text("hello"));
hello.replace(new Text("goodbye!"));
expect(initial.latest().get("hello").text).to.equal("goodbye!");
});
});
The example above covers the relatively simple case of accessing a field dynamically. A more difficult use case is working with the output of filtering a dictionary:
it("proxies edits on filtered dictionaries", () => {
const initial = new Dict({
hello: new Text("world"),
boo: new Text("goop")
}).setStream(new Stream());
// see examples/bidirectional_test.js for fn definition
const fn = new PrefixMatcher("w");
const filtered = filter(new Store(), initial, fn);
expect(filtered.get("boo")).to.be.instanceOf(Null);
expect(filtered.get("hello").text).to.equal("world");
});
The specific implementations of field, map, filter, group and other such functions provided by DotDB all try to have reasonable behavior for mutating the output and proxying those changes upstream. For a given definition of the reactive forward data flow, multiple reverse flows can be defined, and all the bi-directional primitives in DotDB pin both behaviors. Custom behaviors are possible (though intricate to get right at this point), but the inherent strength of two-way bindings is that the usual complexity of event-handling can be avoided with a fairly declarative setup.
One of the goals here is to build UI apps where the complexities of event-handling are not needed. Instead, much of that behavior is obtained via simple, composition-friendly two-way bindings. For example, one can pass a field of a dictionary to a text input and have the text input write to it directly (i.e. changes are applied onto the field of the dictionary). This allows a more declarative approach to not just rendering but also actual edits/mutations.
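As a rough sketch of what this might look like (the bindTextInput helper and the DOM wiring below are illustrative, not part of DotDB's API):

import {field, Dict, Store, Stream, Text} from "dotjs/db";

// hypothetical helper: reads the current value of a dictionary field into a
// text input and writes edits back onto that field via replace()
function bindTextInput(inputElem, store, dict, key) {
  const value = field(store, dict, new Text(key));
  inputElem.value = value.latest().text;
  inputElem.addEventListener("input", () => {
    // the edit is proxied back onto the underlying dictionary field
    value.latest().replace(new Text(inputElem.value));
  });
}

const state = new Dict({title: new Text("hello")}).setStream(new Stream());
bindTextInput(document.querySelector("#title"), new Store(), state, "title");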
All of DotDB uses a demand-driven computation model (Pull FRP). This avoids a lot of the pitfalls of a pure push-based system (such as the need for schedulers, ensuring subscriptions are not leaked, and the cost of computation being tied to all defined derivations even if those derivations are not required at the moment).
One of the interesting consequences of this approach is that defining a large number of views incurs almost no cost if they are never "used". Another consequence is the ability to schedule work better, since updates only happen when the latest() call is initiated. There are other subtle and graceful ways to introduce scheduling priorities without affecting the ability to reason about correctness.
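For example, a UI could pull changes explicitly once per frame rather than reacting to every individual mutation (a rough sketch; rootValue, renderApp and the requestAnimationFrame wiring are illustrative, not part of DotDB):

let view = rootValue; // any DotDB value whose stream receives updates
function renderLoop() {
  view = view.latest();   // fold in pending changes only now
  renderApp(view);        // hypothetical render function
  requestAnimationFrame(renderLoop);
}
renderLoop();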
One of the goals of DotDB is to support building a full UI using this computation scheme, with support for hidden views (such as mobile or desktop variants) that incur no cost unless they are rendered.
Network synchronization is via a Session singleton and a root object (which is typically a Store instance):
// savedSession is initialized via the root object
const savedSession = Session.serialize(new Store());
const root = Session.connect(url, savedSession);
// now root is an instance of Store and can be used
The store maintains a top-level collection of Dict objects, meant for users, messages or other typical top-level collections. Once a store is created, the rest of the objects simply have their streams inherited from it, and the rest of the code can work quite transparently without consideration of any remote activity.
Synchronization is explicit (in keeping with the immutable feel as well as the explicit control over computation) via push():
...
store.push().then(err => {
// do err handling, including binary exponential backoff etc.
});
store.pull().then(err => {
// do err handling
});
Push and pull make only a single attempt, even in the case of success; the caller is expected to keep things alive by repeated invocations (with appropriate backoff). This approach gives callers graceful ways to suspend synchronization as needed.
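A minimal sketch of such a caller-driven loop (the backoff numbers and scheduling are illustrative, not prescribed by DotDB; the err-in-then convention follows the snippet above):

let delay = 1000;
function sync(store) {
  store.push().then(pushErr => store.pull().then(pullErr => {
    // back off on errors, reset the delay on success
    delay = (pushErr || pullErr) ? Math.min(delay * 2, 30 * 1000) : 1000;
    setTimeout(() => sync(store), delay);
  }));
}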
The whole Store instance can be persisted (say using LocalStorage) via a call to Session.serialize(store) and then restored via a call to Session.connect(url, serialized). This effectively creates a snapshot of the session state and later restores things to that earlier state.
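A rough sketch of that flow (assuming the serialized form is JSON-friendly; the storage key is illustrative):

// persist a snapshot of the current session
localStorage.setItem("dotdb-session", JSON.stringify(Session.serialize(store)));

// later, restore the snapshot and reconnect
const saved = JSON.parse(localStorage.getItem("dotdb-session"));
const restored = Session.connect(url, saved);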
Individual values can also be serialized using this approach, and these serializations include functions and computations -- which perform the role of stored procedures and views in database terminology, except that they also show up as strongly typed objects (and so allow programmatic access to fields and can be transformed as if they were data).
All DotDB values also support git-like branch, push/pull semantics:
import {expect} from "chai";
import {Stream, Text} from "dotjs/db";
describe("Branch", () => {
it("should branch and merge", ()=> {
const parent = new Text("hello").setStream(new Stream());
const child = parent.branch();
const child1 = child.splice(5, 0, ", bye!");
const parent1 = parent.splice(5, 0, " world");
expect(parent.latest().text).to.equal("hello world");
expect(child.latest().text).to.equal("hello, bye!");
child1.push()
expect(parent.latest().text).to.equal("hello world, bye!");
expect(child.latest().text).to.equal("hello, bye!");
child1.pull();
expect(parent.latest().text).to.equal("hello world, bye!");
expect(child.latest().text).to.equal("hello world, bye!");
});
});
All values also support automatic undo/redo, with undo/redo limited to the current branch.
examples tbd
All values can be streamed to get the changes made to them, not just the latest value:
const initial = new Text("hello").setStream(new Stream());
initial.splice(0, 1, "H"); // hello => Hello
initial.splice(5, 0, " world.") // Hello => Hello world.
// get next versions
const {change, version} = initial.next;
const {change: c2, version: final} = version.next;
expect(final.text).to.equal("Hello world.");
These changes can also be applied to the initial object to compute the final result. This provides a mechanism for thin clients (such as on mobile devices) to offload expensive views to the cloud, collect the resulting changes and apply them locally.
Some of this remote proxying is relatively easy to implement at this point though the exact API for this is not clear yet.
DotDB has a built-in value Ref(path) which evaluates to the value at the specified path. For instance:
import {expect} from "chai";
import {field, Dict, Ref, Store, Text} from "dotjs/db";
describe("Ref", () => {
it("should evaluate references", ()=> {
const store = new Store();
const table1 = store.collection("table1");
const row1 = table1.get("row1").replace(new Dict({
"hello": new Text("world")
}))
const ref = new Ref(["table1", "row1"])
const hello = field(store, ref, new Text("hello"));
expect(hello.latest().text).to.equal("world");
});
});
This makes it possible to represent rich data structures and to single-instance values. In addition, any computation based on a ref will automatically update if the ref is changed.
- Node-based tests: `cd db; yarn mocha` or `cd db; npm run mocha`.
- Node-based tests with code coverage: `cd db; yarn test` or `cd db; npm test`.
- Browser tests using Karma: `cd db; yarn karma` or `cd db; npm run karma`. This uses karma and headless chrome.
- Browser-based end-to-end tests:
  - Run `cd db; yarn e2e` or `cd db; npm run e2e`. This runs a js server (via test/e2e/server.js).
  - Run