feat: add local langfuse tracing option #106
Conversation
Hey @ahau-square, the Langfuse UI looks good for users to view the traces! Below is what I found about tracing with Langfuse.

Pros

Cons

Alternative

For your reference, this is the PR @codefromthecrypt created for tracing.
When the user passes --tracing but the Langfuse server is not up, it shows the error "ERROR:langfuse:Unexpected error occurred. Please check your request and contact support: https://langfuse.com/support." Maybe we can validate whether the server is up when the user passes the flag.
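One way to fail fast before enabling tracing is a quick health probe. A minimal sketch, assuming Langfuse's self-hosted defaults (port 3000 and the `/api/public/health` endpoint; adjust if your deployment differs):

```python
import urllib.request
import urllib.error


def langfuse_server_is_up(host: str = "http://localhost:3000", timeout: float = 2.0) -> bool:
    """Return True if a Langfuse server answers on its health endpoint."""
    try:
        with urllib.request.urlopen(f"{host}/api/public/health", timeout=timeout) as resp:
            return resp.getcode() == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: treat as "not up"
        return False
```

The CLI could call this before wiring up tracing and print a friendly "Langfuse server not reachable, tracing disabled" message instead of the raw SDK error.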
Ironically, I just got off a call with @marcklingen. So Marc, "goose", as you can read in its lovely readme, is an inference-backed SWE-capable buddy. It uses its own LLM abstraction called exchange. I've thought about instrumenting this to use the genai conventions as tagged above, but I haven't gotten around to spiking that either. My thinking is that we can maybe do both (langfuse and otel) until langfuse handles OTLP, then compare until the latter works as well. wdyt? Do you have an issue to follow up on that? Also, if you can, help @ahau-square meanwhile with any pointers or notes about the comments above. We've been using python VCR tests at times, so we can probably test things work even without accounts. Depending on how things develop, I don't mind contributing help on that, as it is a good way to learn. Meanwhile, I think let's see where we get and keep a debug story going, even if it is a work in progress.
Thanks for tagging me @codefromthecrypt, nice to see you here again and happy to help. "goose" seems cool and I'll try to give it a proper spin later this week.
This seems very reasonable. I am generally very excited about standardization on the instrumentation side via OpenTelemetry. We are tracking support for OTLP in Langfuse here, feel free to subscribe to the discussion or add your thoughts: https://github.com/orgs/langfuse/discussions/2509 From my point of view, short/mid-term Langfuse benefits from its own instrumentation in addition to the standard, but we are looking to be compatible for use cases that are standardized.
Langfuse supports initialization via environment variables which would overcome the issue of having to create an account via the local langfuse server. I will add this to the default docker compose configuration as well and update this message once done.
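For illustration, a compose fragment using Langfuse's headless-initialization variables (the `LANGFUSE_INIT_*` names come from Langfuse's headless-init support; the service name and all values here are placeholders):

```yaml
# docker-compose.override.yml fragment: pre-provision an org, project,
# API keys, and a default user so no manual signup is needed.
services:
  langfuse-server:
    environment:
      LANGFUSE_INIT_ORG_ID: local-org
      LANGFUSE_INIT_PROJECT_ID: local-project
      LANGFUSE_INIT_PROJECT_PUBLIC_KEY: pk-lf-local
      LANGFUSE_INIT_PROJECT_SECRET_KEY: sk-lf-local
      LANGFUSE_INIT_USER_EMAIL: local@example.com
      LANGFUSE_INIT_USER_NAME: Local User
      LANGFUSE_INIT_USER_PASSWORD: localpassword
```

The SDK side can then pick up the matching `LANGFUSE_PUBLIC_KEY`/`LANGFUSE_SECRET_KEY`/`LANGFUSE_HOST` values from the environment.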
Langfuse client SDKs support initialization via environment variables as well.

Side note: I think it would be interesting here to return a deep link to the trace in the local Langfuse UI in the CLI output if tracing is enabled, to make it easier to jump to the trace from any invocation of goose for the debugging use case.
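A rough sketch of what that deep link could look like. The URL shape and the default `local` project id are assumptions for illustration; newer Langfuse Python SDKs also expose a helper for this (`langfuse_context.get_current_trace_url()`), which would be the more robust option:

```python
def trace_deep_link(trace_id: str,
                    host: str = "http://localhost:3000",
                    project_id: str = "local") -> str:
    """Build a link to a trace in the local Langfuse UI (URL shape is an assumption)."""
    return f"{host}/project/{project_id}/traces/{trace_id}"


# The CLI could print this at the end of a traced session:
print(f"View trace: {trace_deep_link('abc123')}")
```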
Added support for headless initialization via Docker Compose here: https://github.com/langfuse/langfuse/pull/3568/files. This change required explicitly handling empty string values for these environment variables, as Docker Compose does not allow adding environment variables that may not exist without setting a default value. Therefore, this depends on the next Langfuse release, which will include this fix (scheduled for tomorrow).
We just released the change. Let me know if you have any questions, @ahau-square. You can now pass the init envs to the Langfuse docker compose deployment to initialize default credentials.
Thanks @marcklingen! Got the default credentials working. Is there a way to bypass the login screen altogether after that? Or do users always need to enter the default credentials to log in for the first time?
There's no way to bypass it. They'd need to sign in with the default credentials to view the data in the UI.
Quick addition to the above @ahau-square: if this is local anyway, you could make all traces public. Thereby users could jump directly from a trace URL in their console output to viewing the trace in their local Langfuse instance without having to sign in. This will limit the features that are available though. More on this here: https://langfuse.com/docs/tracing-features/url#share-trace-via-url
nit: is this a chore or a feature? The PR title shows chore, but I suspect the result is more a feature than, say, reformatting files. wdyt?
Nice one, @ahau-square! A couple of suggestions:
Since the Langfuse server is a self-contained application, we could create a folder outside of src/exchange (could be under
Agree with @lifeizhou-ap. You could just copy/paste the docker compose file from Langfuse to make it easier to get started and to set any kind of default environment variables permanently in the docker compose file. This should make it easier to use and prevents leaking Langfuse-specific files into the local goose folder that are not necessary to run the software.
@ahau-square was asking what kinds of tests should be written for this PR. In terms of testing, I feel like we can keep it simple and just write unit tests.
I found it is a bit hard to test at the integration level. Although we can run the tests and start the Langfuse server, we still have to manually log in to verify the traced data unless there is an existing Langfuse API for us to verify it. Another alternative is to test the integration with Langfuse, but it requires us to know the implementation details of langfuse_context.observe. I feel it is a bit overkill for us to test this way too. FYI, these look like the relevant implementation and tests in Langfuse Python. It would be great if you could give us some advice @marcklingen :) Thank you!
Usually testing this is overkill for most teams, as you mentioned. If you do want to test it, I've seen some teams run an example and then fetch the same trace id (fetch_trace) to check that it includes all of the observations it should include. This however necessitates that you (1) run Langfuse in CI via docker compose, (2) use flush to make sure the events are sent immediately at the end of the test, and (3) wait e.g. 2-5 sec in CI, as Langfuse does not have read-after-write consistency on the APIs.
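A sketch of what such a CI check could look like using only the standard library. The endpoint path (`/api/public/traces/{id}`), basic-auth scheme (public key as user, secret key as password), and response shape are based on my understanding of Langfuse's public API and should be verified against its docs; the 5-second wait follows the suggestion above:

```python
import base64
import json
import time
import urllib.request


def basic_auth_header(public_key: str, secret_key: str) -> str:
    """Langfuse's public API uses HTTP basic auth: public key as user, secret key as password."""
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return f"Basic {token}"


def fetch_trace(trace_id: str, public_key: str, secret_key: str,
                host: str = "http://localhost:3000") -> dict:
    """Fetch a single trace from the Langfuse public API (path is an assumption)."""
    req = urllib.request.Request(
        f"{host}/api/public/traces/{trace_id}",
        headers={"Authorization": basic_auth_header(public_key, secret_key)},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


def assert_trace_has_observations(trace_id: str, expected_names: set,
                                  public_key: str, secret_key: str) -> None:
    """CI flow: flush the SDK first, wait for async ingestion, then verify."""
    time.sleep(5)  # Langfuse has no read-after-write consistency on these APIs
    trace = fetch_trace(trace_id, public_key, secret_key)
    found = {obs.get("name") for obs in trace.get("observations", [])}
    missing = expected_names - found
    assert not missing, f"missing observations: {missing}"
```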
This is awesome! Pretty excited to try this out :)
We need the wrapper in the block plugins repo also, since we want to wrap the block provider completions there as well.
lgtm!
✨ Looks great! Working well for me locally.
Left some comments about verbosity of outputs and some log handling.
* main:
  - feat: add groq provider (#134)
  - feat: add a deep thinking reasoner model (o1-preview/mini) (#68)
  - fix: use concrete SessionNotifier (#135)
  - feat: add guards to session management (#101)
  - fix: Set default model configuration for the Google provider. (#131)
  - test: convert Google Gemini tests to VCR (#118)
  - chore: Add goose providers list command (#116)
  - docs: working ollama for desktop (#125)
  - docs: format and clean up warnings/errors (#120)
  - docs: update deploy workflow (#124)
  - feat: Implement a goose run command (#121)
Nice @ahau-square, does it make sense to add a quick get-started guide to the docs? Happy to help, as I really like the use case of running this fully locally.
This reverts commit 56d88a8.
* origin/main: feat: add local langfuse tracing option (#106)
* main: (23 commits)
  - feat: Run with resume session (#153)
  - refactor: move langfuse wrapper to a module in exchange instead of a package (#138)
  - docs: add subheaders to the 'Other ways to run Goose' section (#155)
  - fix: Remove tools from exchange when summarizing files (#157)
  - chore: use primitives instead of typing imports and fixes completion … (#149)
  - chore: make vcr tests pretty-print JSON (#146)
  - chore(release): goose 0.9.5 (#159)
  - chore(release): exchange 0.9.5 (#158)
  - chore: updates ollama default model from mistral-nemo to qwen2.5 (#150)
  - feat: add vision support for Google (#141)
  - fix: session resume with arg handled incorrectly (#145)
  - docs: add release instructions to CONTRIBUTING.md (#143)
  - docs: add link to action, IDE words (#140)
  - docs: goosehints doc fix only (#142)
  - chore(release): release 0.9.4 (#136)
  - revert: "feat: add local langfuse tracing option (#106)" (#137)
  - feat: add local langfuse tracing option (#106)
  - feat: add groq provider (#134)
  - feat: add a deep thinking reasoner model (o1-preview/mini) (#68)
  - fix: use concrete SessionNotifier (#135)
  - ...
why
The purpose of this PR is to integrate local Langfuse tracing into the project to enhance debugging and monitoring capabilities. Tracing allows developers to observe the flow of execution and diagnose issues more effectively.
what
Exchange Package:
- Defined `observe_wrapper`, a wrapper around the `observe` decorator that applies it only if the Langfuse local env variables are set
- Added the `observe` decorator to the tool calling function and to providers' completion functions
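A minimal sketch of what such a conditional wrapper could look like. The env variable names checked here are assumptions for illustration; the actual implementation lives in the langfuse wrapper module in this PR:

```python
import functools
import os


def observe_wrapper(*observe_args, **observe_kwargs):
    """Apply Langfuse's `observe` decorator only when Langfuse env vars are configured.

    Env var names below are an assumption; when they are absent (or langfuse is
    not installed), the decorated function is returned effectively unchanged.
    """
    def decorator(func):
        if os.environ.get("LANGFUSE_PUBLIC_KEY") and os.environ.get("LANGFUSE_SECRET_KEY"):
            try:
                from langfuse.decorators import observe  # imported lazily
                return observe(*observe_args, **observe_kwargs)(func)
            except ImportError:
                pass  # langfuse not installed; fall through to a no-op

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        return wrapper
    return decorator


# Usage: tracing kicks in only when the env vars are present.
@observe_wrapper(name="completion")
def complete(prompt: str) -> str:
    return f"echo: {prompt}"
```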
Goose:
- Modifications to set up Langfuse tracing upon CLI initialization
- Updates to trace session-level information
- Added the `observe` decorator to `reply`
usage
Apply the `observe_wrapper` decorator to functions for automatic integration with Langfuse, providing detailed execution insights. Docker is a requirement. Run with the `--tracing` flag, which will initialize Langfuse with the set env variables.
setting up locally hosted Langfuse
- Run the `setup_langfuse.sh` script (in `packages/exchange/src/exchange/langfuse`) to download and deploy the Langfuse docker container with the default initialization variables found in the `.env.langfuse.local` file.
- Sign in to the local Langfuse UI with the default credentials (from the `.env.langfuse.local` file).
- Start a traced session: `goose session start --tracing`
Sample trace viewing: