Provide better exception when unable to resolve class while deserializing rulebase #264

Open
EthanEChristian opened this issue Feb 9, 2017 · 2 comments

@EthanEChristian
Contributor

A rulebase or session may be serialized with a newer version of the rules that contains new record types, and then deserialized with an older version of the rules that does not contain those classes.

In that case, deserialization throws a NullPointerException when trying to resolve the constructor symbol in the read-record function.

In this scenario I would expect the deserialization to fail, but to provide a better exception explaining that the constructor for the missing record could not be resolved.

Typically this sort of error would affect deserializing a session, but it can also affect the rulebase when a custom fact-type function is provided and a record is used as the type in a rule's condition. For example:

;; Requires added for completeness; the namespace name is illustrative.
(ns example.custom-facts
  (:require [clara.rules :refer [defrule defquery mk-session insert insert! fire-rules]]))

(defprotocol IFactType
  (fact-type [x]))

(defprotocol IAncestors
  (custom-ancestors [fact-type]))

(extend-type Object
  IFactType
  (fact-type [x] (type x))
  IAncestors
  (custom-ancestors [x]  (ancestors x)))

(defrecord CustomFact [fact-type value]
  IFactType
  (fact-type [x] fact-type))

(defrecord FactType [name]
  IAncestors
  (custom-ancestors [x]  #{CustomFact}))

(defrecord Outcome [final-value])

(def matching-fact (->FactType "Fact-1"))

(defrule some-rule
  [matching-fact [{:keys [value]}] (= ?value value)]
  =>
  (insert! (->Outcome ?value)))

(defquery get-outcomes
  []
  [Outcome (= ?final-value final-value)])

(def session (-> (mk-session [some-rule get-outcomes]
                             :fact-type-fn fact-type
                             :ancestors-fn custom-ancestors)
               (insert (->CustomFact (->FactType "Fact-1") 23))
               (fire-rules)))
@WilliamParker
Collaborator

Note that this error could occur in various scenarios that lead to the record map constructor function not being resolvable. In this case, my understanding from the issue and offline conversations with @EthanEChristian is that it occurred from serializing with one version of an application consuming Clara and deserializing with a newer version of that application, but it could also occur if, say, the needed namespace simply wasn't required, even if the right namespace and version were on the classpath. Ideally we'd handle the nil in such a way that a meaningful exception is thrown, saying something like "Couldn't resolve the needed record constructor function map->YourRecordNameHere." However, since deserialization involves lots of calls to this record handler, any error checking will need to be done in a very performance-sensitive way.
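A minimal sketch of what that check might look like, assuming a hypothetical helper at the point where the map-> constructor symbol gets resolved (this is not Clara's actual read-record code, and the names are illustrative):

(require '[clojure.string :as string])

;; Hypothetical helper, not Clara's implementation: resolve the map->Record
;; constructor var for a serialized record class name, throwing a descriptive
;; exception instead of a NullPointerException when it cannot be found.
(defn resolve-record-ctor
  [^String record-class-name]
  (let [idx      (.lastIndexOf record-class-name ".")
        ;; Java class names munge hyphens to underscores; undo that for the ns.
        ns-sym   (symbol (string/replace (subs record-class-name 0 idx) "_" "-"))
        ctor-sym (symbol (str "map->" (subs record-class-name (inc idx))))
        ctor     (some-> (find-ns ns-sym) (ns-resolve ctor-sym))]
    (or ctor
        (throw (ex-info (str "Couldn't resolve the needed record constructor function "
                             ns-sym "/" ctor-sym ". The namespace defining the record may "
                             "be missing from the classpath or may simply not be required.")
                        {:record-class record-class-name})))))

The or/throw is essentially free on the happy path, so a check along these lines shouldn't add much overhead even though the handler is called very frequently.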

@mrrodriguez
Collaborator

A few notes:

  • The durability implementation is very brittle with respect to cross-version concerns. Its emphasis is on minimizing the time it takes to serialize and, especially, deserialize.
  • Sacrifices are made here. It is a brittle format with very little ability to support long-term storage of a serialized image that may be read back into an environment where versions of things have changed.
  • Deserialization assumes that the Clojure runtime environment will be "bootstrapped" to be suitable for all symbolic references needed by the serialized image.

Basically, Clara durability is about moving a rulebase and working memory from point A to point B "across the wire" as a means to distribute where the rules can be run, and to store sessions to "disk" when needed in something like a per-client-request webserver environment.

I agree this error is not the best. However, I'll mention that this form of durability is not that different from something like Java serialization. Java serialization also fails with some really confusing errors for similar reasons.

Adding error handling at such a fine-grained level is going to be tedious. It is also a performance concern, as @WilliamParker mentions above.

One approach that could be taken is to "stamp" the serialized images with a version requirement of some sort. This would be similar to how the Java class file format carries a version that is checked when the JVM loads it; the JVM fails to load the class if the version is outside the supported range (typically when it is newer than the JVM's current version).
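A rough sketch of that coarse check, assuming hypothetical write-payload!/read-payload functions standing in for whatever writer and reader the durability backend already uses (none of these names exist in Clara today):

(def ^:const serialized-format-version 1)

;; Hypothetical wrappers: stamp the image with a format version on write and
;; verify it on read before handing back the payload.
(defn write-stamped!
  [write-payload! out session-data]
  (write-payload! out {:format-version serialized-format-version
                       :payload        session-data}))

(defn read-stamped
  [read-payload in]
  (let [{:keys [format-version payload]} (read-payload in)]
    (when (not= format-version serialized-format-version)
      (throw (ex-info (str "Serialized image has format version " format-version
                           ", but this build expects " serialized-format-version ".")
                      {:found format-version :expected serialized-format-version})))
    payload))

The check runs once per image rather than once per record, so it stays out of the performance-sensitive inner loop.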

A more granular approach could be to stamp a version per record type; this would be similar to Java serialization's serialVersionUID concept. I think this would get messy and complex, and again could become a performance concern to check. It also would only cover records.

There can be more problems with crossing versions than records. Functions are serialized mostly via a var name (a symbolic reference) that is expected to exist, followed by the arguments to the function. If the function's arity were later changed in an incompatible way (not a great practice, but common), then these function calls would eventually fail at deserialization or later when used. Issues like this get increasingly difficult to reasonably handle as errors in the Clara durability layer.
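To illustrate that last point, here is a sketch of the general shape of restoring a symbolic function reference (not Clara internals; the names are illustrative). The var resolves fine at read time, and an incompatible arity only surfaces when the restored call is finally applied:

;; Illustration only: restore a serialized [fn-symbol args] pair into a thunk.
(defn restore-fn-call
  [[fn-sym args]]
  (let [f (or (resolve fn-sym)
              (throw (ex-info (str "Couldn't resolve function var " fn-sym)
                              {:symbol fn-sym})))]
    ;; If the var's arity changed incompatibly since serialization, this apply
    ;; throws clojure.lang.ArityException when the thunk is invoked, not here.
    (fn [] (apply f args))))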
