Use Cases #1
Unless I am mistaken, it is also possible to use Cassette as an alternative to subtyping concrete types, right? I would like to add my reproducible floating point type to the list of possible applications, along with others who would like to create types that are identical to concrete types except for a few modifications to how certain functions dispatch on them.
It depends on what you mean. I wouldn't call it an "alternative to subtyping concrete types", since the latter is ambiguous/ill-defined.
This isn't what Cassette is doing, but if I understand you correctly, then yes - Cassette enables a different approach that should let you achieve the same goal (hijacking dispatch for only a subset of operations without worrying about implementing a full type interface).
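To make the "hijacking dispatch for a subset of operations" idea concrete, here is a minimal sketch in plain Julia; the names (`ReproducibleCtx`, `overdub`) are illustrative only and are not Cassette's actual API:

```julia
# Minimal sketch of context-based dispatch hijacking (illustrative, not Cassette's API).
struct ReproducibleCtx end

# Fallback: in this context, calls behave as usual.
overdub(::ReproducibleCtx, f, args...) = f(args...)

# Hijack only the operations we care about, e.g. force a fixed
# left-to-right summation order for reproducible floating point.
overdub(::ReproducibleCtx, ::typeof(sum), xs::Vector{Float64}) =
    foldl(+, xs; init = 0.0)

ctx = ReproducibleCtx()
hijacked = overdub(ctx, sum, [1.0, 2.0, 3.0])  # goes through our method
passthru = overdub(ctx, *, 2.0, 3.0)           # falls through to the normal method
```

The point is that only `sum` on `Vector{Float64}` is intercepted; every other call falls through unchanged, so no full type interface needs to be implemented.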
What's the short summary of how Cassette works? Which function is the one that drills into Julia's entrails?
Once my JuliaCon 2017 talk is up on YouTube, that should be an okay resource (the relevant part is near the end of the talk). Otherwise, the entry point for the actual tracing mechanism is in the source. Everything is in a very early stage; we still require some core Julia development before a truly robust Cassette implementation can exist. Cassette code is subject to drastic change, which is why I haven't documented anything and only have some minor smoke tests.
Definitely keeping my eye on this for multiple use cases. The primary one, as discussed at JuliaCon, is altering dispatch: e.g. calls to . A second use case I thought about: dynamic parallelism, aka kernels launching kernels. For example:

```julia
# from CPU
@cuda kernel()

function kernel()
    # from GPU
    @cuda another_kernel()
end
```

As
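A rough sketch of how a context could decide launch semantics at the same call site, so that `@cuda` inside a kernel means a device-side launch while the outer one means a host-side launch. All names here (`launch`, `FromCPU`, `FromGPU`) are hypothetical, not CUDAnative's API:

```julia
# Sketch: the same `launch` call site behaves differently depending on context.
abstract type LaunchCtx end
struct FromCPU <: LaunchCtx end  # host-side driver launch
struct FromGPU <: LaunchCtx end  # device-side launch (dynamic parallelism)

launch(::FromCPU, f) = (f(FromGPU()); :host_launch)
launch(::FromGPU, f) = (f(FromGPU()); :device_launch)

another_kernel(ctx) = nothing
kernel(ctx) = launch(ctx, another_kernel)  # context decides what "launch" means

outer = launch(FromCPU(), kernel)  # outer launch comes from the CPU
inner = kernel(FromGPU())          # the same code, launched from the GPU
```

A Cassette-like tool could thread this context through automatically instead of passing it explicitly.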
I think so, but it depends on what you mean w.r.t. adding inference edges. Cassette doesn't expose any type inference facilities - it only uses the normal Julia reflection capabilities (well, they will be "normal" by Julia 1.0, hopefully) - so it doesn't provide any new way to manually add inference edges. Of course, you can replace function calls with other function calls, and type inference runs normally on all of this code, so any necessary new edges should be added automatically via type inference. Note that Cassette specifically intercepts lowered code, so macros like e.g.

Not sure if this is valuable to you, but you can certainly use macros to drop "markers" into your code for Cassette to pick up (e.g. in order to cue specialized processing).
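The marker idea can be sketched in plain Julia without Cassette. `@mark` and `marker` below are made-up names; in a real implementation, a Cassette pass would detect the `marker` call in the lowered code rather than anything happening at runtime:

```julia
# A no-op "marker" function: it does nothing at runtime, but the call
# to it remains visible in the lowered code for a pass to pick up.
marker(::Val, x) = x

# A macro that wraps an expression with a tagged marker call.
macro mark(tag, ex)
    :(marker(Val($(QuoteNode(tag))), $(esc(ex))))
end

f(x) = @mark important (x + 1)

result = f(41)  # behaves exactly like x + 1 at runtime
```

Inspecting `code_lowered(f, (Int,))` would show the `marker` call, which is the hook a lowered-code pass could cue specialized processing on.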
Oh yeah I know, but Cassette.jl would give me all the necessary information to call
That would be valuable indeed; the macros are just how I'd like to model the user-facing API, regardless of what they boil down to.
Could I implement a
Yes, you could just drop a meta
I've been looking into probabilistic programming and I think Cassette could be used for it. There seem to be two approaches to probabilistic programming languages. The approach taken by Stan and Turing.jl is to specify the log likelihood and do Hamiltonian Monte Carlo. An alternative approach, taken by Anglican and others, is to do Sequential Monte Carlo (among other methods) over the execution trace of the program. This strategy is described in [1] and [2].

[2] Accelerating Data Science with HPC: Probabilistic Programming [YouTube]
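A tiny sketch of the execution-trace idea, with the interception done by hand rather than by Cassette (all names are illustrative): the model records every random choice it makes, giving the trace that SMC-style inference would manipulate.

```julia
using Random

# A trace records every random choice made during one run of the model.
struct Trace
    choices::Vector{Float64}
    Trace() = new(Float64[])
end

# In a Cassette-style implementation, calls to `rand` inside the model
# would be rewritten to go through this; here the model calls it explicitly.
function sample!(t::Trace, rng)
    x = rand(rng)
    push!(t.choices, x)
    return x
end

function model(t::Trace, rng)
    a = sample!(t, rng)
    b = sample!(t, rng)
    return a + b
end

rng = MersenneTwister(1)
t = Trace()
y = model(t, rng)
```

After the run, `t.choices` holds the full execution trace, which inference can replay or perturb.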
@jrevels Do you have any ballpark idea how far off a Cassette-based autodiff is?
See https://github.com/JuliaDiff/Capstan.jl for an initial proof of concept. Once Cassette is release-ready, however, my approach in Capstan may look drastically different, since the current approach was just to test out some basic ideas. For example, one of the main motivators for Cassette's pass-injection mechanism was to allow the use of Julia IR + native language constructs to build the tape, rather than being limited to a non-code graph data structure. Capstan doesn't show any of that off yet, since that's just an implementation detail compared to the stuff I needed to demo at the time I made the package.

Also, I haven't been bothering to keep Capstan up-to-date with Cassette, and won't be concerned with doing so until I can tag a Cassette release. The plan since last year has been to have an initial Capstan release around JuliaCon 2018, but we'll see how it goes. The big blocker right now to even working on Capstan is JuliaLang/julia#25492, and the bulk of my day-to-day Julia coding is plugging away at that issue. If progress continues along the current trend (i.e. I don't run into too many big unexpected items), we should be on track.
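For a sense of what "building a tape" means in reverse-mode autodiff, here is a minimal hand-written tape; this is purely illustrative and is not Capstan's design (Capstan's point is precisely to avoid hand-writing this by reusing Julia IR):

```julia
# A tape of recorded pullbacks, applied in reverse order during backprop.
struct Tape
    ops::Vector{Function}
end

mutable struct Tracked
    value::Float64
    grad::Float64
end

track(v) = Tracked(v, 0.0)

# Record the multiply: compute forward value, push its pullback onto the tape.
function mul(tape::Tape, a::Tracked, b::Tracked)
    out = Tracked(a.value * b.value, 0.0)
    push!(tape.ops, () -> begin
        a.grad += b.value * out.grad
        b.grad += a.value * out.grad
    end)
    return out
end

function backward!(tape::Tape, out::Tracked)
    out.grad = 1.0
    foreach(f -> f(), reverse(tape.ops))
end

tape = Tape(Function[])
x = track(3.0); y = track(4.0)
z = mul(tape, x, y)
backward!(tape, z)
# x.grad is now d(x*y)/dx = y.value, and y.grad is d(x*y)/dy = x.value
```

An operator-overloading tape like this is the "non-code graph data structure" flavor; the IR-based approach mentioned above would build the equivalent record out of transformed Julia code instead.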
Closing since this was just an early discussion issue, and has no action items. Feel free to keep discussing here though. |
@jrevels Thanks, doesn't sound too far off. This is all really exciting!
I've talked to a lot of nice folks recently about potential Cassette use cases. Here are the ones I remember: