feat(new sink): Add AppSignal sink #16650
Conversation
✅ Deploy Preview for vrl-playground ready!
✅ Deploy Preview for vector-project ready!
Regression Detector Results (Run ID: 7c58511a-a6b1-4adf-afee-929f397b06f2): a regression test is an integrated performance test for Vector. Experiments that show a statistically significant change in mean optimization goal between the baseline and comparison SHAs (90.00% confidence, more than ±5.00% change in mean), or that are newly erratic (coefficient of variation > 0.1), are listed; negative values mean the baseline is faster, positive the comparison. No interesting changes in experiment optimization goals were observed for this run.

Regression Detector Results (Run ID: bc0bebf1-a08d-4107-a7cf-2d169cbc27c4): changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00% were detected.
👋 Before we dive too far into the code review, we'll look at things from a high level. We have a process in the works for evaluating new components, and it's currently being documented fully. Once that is in place, we'll basically want a GH issue created explaining the need for the feature, the functional gaps in Vector that led to the new component, motivations, etc. Have you considered adding gRPC support at the destination, so that the existing `vector` sink could be used?
Hi @neuronull, thanks for your reply. I'm Thijs, and I also work at AppSignal. If that process had been in place, we would have created an issue first! We are very excited about Vector and want to be part of the journey. We love Vector's goal of making observability vendor-independent and want to invest in that wherever possible. This change will enable our customer base to leverage Vector: it makes it easy to configure Vector to work with our product, which will drive adoption. We took care to make this change as small as possible: we use the existing protobuf data structure, taking on the maintenance burden of reacting to changes in Vector on our end. In addition, we are more than happy to commit to maintaining the code in this pull request going forward. Our infrastructure does not currently support gRPC, so using the Vector sink was not an option. Also, it's essential for a level playing field to have a straightforward way to configure AppSignal, just like there is for other supported platforms and vendors. I hope this helps answer your questions. Let me know if you need additional information or changes made before you approve.
Hi @thijsc 👋 We are excited about your interest in being a part of Vector!
Much appreciated, and we would definitely take you up on that 😃, it helps in our decision process. I'll take another look at this and post any further queries here (if there are any) before proceeding with the approach taken and reviewing the code.
We are very happy to do so. We are tracking #9261 and will make sure to update our sink when a second round of refactoring is needed.
After discussing this with the broader team, we would like to propose an alternative approach: leveraging the existing `http` sink.
Side note:
Curious to hear your thoughts. Thanks~
We are big fans of protobuf, so posting that format seemed like a great way to do it to us. I do see your argument that that format is internal, though. We will rework this pull request to use the HTTP sink and a JSON-based format. We might add and document another endpoint on our end so we can combine metrics and logs in one post.
Thanks much for your understanding, @thijsc. The next step is that we'll take this to our Product team for approval. I will post a comment here when we have a green light from them. Thanks~
One other consideration: as part of the final solution, we will want the new sink integration tested, to assure quality going forward as well as for the initial implementation. With the HTTP sink approach, one potential way to do that would be with a stub HTTP server. Alternatively, the integration test could access the AppSignal service itself, which would be a more robust test.
We can definitely coordinate an integration test that runs on our production environment. We will start working on some preparations for the HTTP sink. Looking forward to the response from the product team!
Force-pushed from 35f5b8d to 8a94e60
Add a new sink for the [AppSignal.com](https://www.appsignal.com/) service. It sends a JSON payload to the public appsignal-endpoint.net endpoint, using the HttpClient.
Regression Detector Results (Run ID: 7393504a-40bd-48e6-a95b-453562986374): no interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.
Hey @thijsc, the Product team has approved the new integration. We look forward to seeing the redesign! Let us know if you encounter issues.
Force-pushed from 8a94e60 to ab78025
Very happy to hear that! We've pushed code that uses JSON in the meantime. We will add all the bells and whistles, including an integration test, and will let you know once we think this is ready for a real review.
Regression Detector Results (Run ID: 9a59a803-38ce-43d9-8489-bb84af62b712): no interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.
We auto-configure the TLS settings with the defaults. Remove the documented config option.
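For illustration, a minimal sketch of what that could look like, assuming Vector's internal `TlsSettings::from_options` and `HttpClient::new` helpers and a `cx: SinkContext` in scope; the exact call sites in the PR may differ:

```rust
// Sketch: build the HTTP client with default TLS settings instead of
// exposing a user-facing TLS option in the sink configuration.
let tls_settings = TlsSettings::from_options(&None)?;
let client = HttpClient::new(tls_settings, cx.proxy())?;
```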
This should improve how the docs for this config option are displayed. I couldn't directly set the String value as the default; per the Serde docs, I had to use a function that returns the default value: https://serde.rs/attr-default.html
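For reference, a small sketch of the Serde pattern being described; the struct, field, and helper names are illustrative rather than the PR's exact code:

```rust
use serde::Deserialize;

// Serde's `default` attribute takes the path of a function that returns
// the default value; a bare string literal is not accepted.
fn default_endpoint() -> String {
    "https://appsignal-endpoint.net".to_string()
}

#[derive(Deserialize)]
struct AppsignalSinkConfig {
    #[serde(default = "default_endpoint")]
    endpoint: String,
}
```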
I pulled down the branch and ran the integration tests 👍
A couple of other pedantic / code-hygiene things I noticed.
- Add documentation for the EventEncoder.
- Remove unused `as_error_type` function.
- Make things not public if they should not be.
Regression Detector Results (Run ID: 90646833-7415-4c37-b4c5-cffc6daa9b1a): changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00% were detected.
Everything here looks good to me, but I would like the option to disable compression if desired.
Use the built-in Compression struct instead. As mentioned on the PR, add the no-compression option again, and use it by default. vectordotdev#16650 (comment)
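A rough sketch of the config shape this describes, assuming the `Compression` type from Vector's sink utilities (whose `Default` is no compression); the struct and field names are illustrative, and a follow-up commit below later switches the default to gzip, presumably via a helper like Vector's `Compression::gzip_default`:

```rust
use crate::sinks::util::Compression; // path as used inside the Vector repo (assumed)

#[derive(serde::Deserialize)]
struct AppsignalSinkConfig {
    // Defaults to `Compression::None`, so users can opt out of compression.
    #[serde(default)]
    compression: Compression,
}
```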
Regression Detector Results (Run ID: 24c3ec7b-60e2-433d-8fb8-0ea553377f29): no interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.
Force-pushed from bf505d5 to c042735
Awesome, thank you! I re-triggered the regression test workflow that failed to complete, to unblock merging.
Regression Detector Results (Run ID: 80528bcf-fc18-489a-a0cc-8c524301cf26): no interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.
As discussed in the PR, use gzip as the default compression for the AppSignal sink for now. vectordotdev#16650 (comment)
Use the `content_encoding` function to set the Content-Encoding header value. This replaces the match statement with a simpler if statement.
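To illustrate the simplification, a sketch assuming `endpoint: &str` and `compression: Compression` are in scope; the request-builder shape here is illustrative, not the PR's exact code:

```rust
// `Compression::content_encoding()` returns the Content-Encoding value
// (e.g. Some("gzip")) for compressed payloads and None otherwise, so a
// single `if let` replaces a match over every compression variant.
let mut builder = http::Request::post(endpoint)
    .header("Content-Type", "application/json");
if let Some(encoding) = compression.content_encoding() {
    builder = builder.header("Content-Encoding", encoding);
}
```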
❌ Deploy Preview for vrl-playground failed.
Regression Detector Results (Run ID: a5bc54c6-93da-4465-bf66-f0de29024645): changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00% were detected.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Ran the integration test locally again one final time.
Everything is looking great! Thanks for the contribution @tombruijn and @thijsc 🚀
Thanks for all the help @neuronull!
* chore(appsignal sink): Refactor to use StreamSink

Previously, the AppSignal sink was written in what was already a bit of an older style in PR #16650. We want to change some functionality in the future for how metrics are sent; to do this, it looks like we'll need the newer sink style, or at least it will make the change easier. With this change, the AppSignal sink's functionality remains the same. We have updated the sink to the new StreamSink style, using an HttpBatchService wrapper to send the requests to the AppSignal public endpoint API. We followed the [sink guides][2] initially and looked at other sinks already rewritten, linked in [issue #9261][1], to see how to implement it further.

Updated the integration tests to check that the sink is an HTTP sink with the `HTTP_SINK_TAGS`. Previously, they didn't yet test whether the `EndpointBytesSent` event was sent. We're unsure whether `AppsignalResponse`'s `bytes_sent` needs to be implemented; the tests also pass if it returns `None`, but we thought we might as well implement it properly.

Part of [tracking issue #9261][1]

[1]: #9261
[2]: https://github.com/vectordotdev/vector/blob/600f8191a8fe169eb38c429958dd59714349acb4/docs/tutorials/sinks/1_basic_sink.md

Co-authored-by: Jeff Kreeftmeijer <jeff@kreeft.me>

* Split AppSignal sink into separate modules

As per review feedback: split the new sink style into separate module files.

* Fix visibility of things in AppSignal sink

It doesn't need to be visible for the entire crate, only within the AppSignal sink scope.

---------

Co-authored-by: Jeff Kreeftmeijer <jeff@kreeft.me>
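For context, a minimal skeleton of the StreamSink style this commit describes; the batching and `HttpBatchService` plumbing is elided, and the crate paths for `StreamSink` and `Event` are assumptions about Vector's internal layout:

```rust
use async_trait::async_trait;
use futures::stream::BoxStream;
use vector_core::{event::Event, sink::StreamSink}; // internal crate paths, assumed here

struct AppsignalSink {
    // endpoint, HTTP client, batch settings, ...
}

impl AppsignalSink {
    async fn run_inner(self: Box<Self>, _input: BoxStream<'_, Event>) -> Result<(), ()> {
        // Batch incoming events, build HTTP requests, and drive them
        // through an `HttpBatchService` (elided in this sketch).
        Ok(())
    }
}

#[async_trait]
impl StreamSink<Event> for AppsignalSink {
    async fn run(self: Box<Self>, input: BoxStream<'_, Event>) -> Result<(), ()> {
        self.run_inner(input).await
    }
}
```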