Add a very basic prometheus exporter #2143
Conversation
configure?.Invoke(options);
var exporter = new PrometheusExporter(options);
var pullMetricProcessor = new PullMetricProcessor(exporter, false);
exporter.MakePullRequest = pullMetricProcessor.PullRequest;
Some tricks to make the "pull request" happen from the scrape request.
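For context, a rough sketch of how an AddPrometheusExporter extension around this fragment might be put together; everything outside the four quoted lines (the extension class name, the options type, and how the processor and HTTP listener get registered) is an assumption, not necessarily what the PR does:

using System;
using OpenTelemetry.Metrics;

public static class PrometheusExporterMeterProviderBuilderExtensions
{
    public static MeterProviderBuilder AddPrometheusExporter(
        this MeterProviderBuilder builder,
        Action<PrometheusExporterOptions> configure = null)
    {
        // Hypothetical wiring; only the four lines quoted above are from the PR.
        var options = new PrometheusExporterOptions();
        configure?.Invoke(options);

        var exporter = new PrometheusExporter(options);
        var pullMetricProcessor = new PullMetricProcessor(exporter, false);

        // The exporter holds a delegate to the processor's PullRequest, so an
        // incoming scrape can trigger collection of the in-memory state.
        exporter.MakePullRequest = pullMetricProcessor.PullRequest;

        // ... start the metrics HTTP listener and register pullMetricProcessor
        // with the builder (details omitted in this sketch).
        return builder;
    }
}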
using var output = ctx.Response.OutputStream;
using var writer = new StreamWriter(output);
this.exporter.MakePullRequest();
Calling MakePullRequest() requests the metrics from the in-memory aggregation store and causes it to call the Export() method, which stores the metrics inside the exporter.
The next line, this.exporter.WriteMetricsCollection, simply writes the metrics stored inside the exporter.
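To make that flow concrete, here is a sketch of what the scrape handler around these lines might look like inside the HttpListener-based metrics server; the method name, status-code handling, and the exact WriteMetricsCollection signature are assumptions:

// Requires System.IO and System.Net.
private void ProcessRequest(HttpListenerContext ctx)
{
    ctx.Response.StatusCode = 200;
    ctx.Response.ContentType = "text/plain; version = 0.0.4";

    using var output = ctx.Response.OutputStream;
    using var writer = new StreamWriter(output);

    // Pull: ask the in-memory aggregation store to Export() into the exporter.
    this.exporter.MakePullRequest();

    // Write the metrics the exporter just stored, in Prometheus text format.
    this.exporter.WriteMetricsCollection(writer);
}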
System.Console.ReadLine();

using var meterProvider = Sdk.CreateMeterProviderBuilder()
    .AddSource("TestMeter")
    .AddPrometheusExporter(opt => opt.Url = $"http://localhost:{port}/metrics/")
I like the simplicity here. The user is not expected to know anything about temporality (delta vs cumulative)! Hope it can stay that way.
Seems reasonable, because Prometheus is always cumulative right? I guess the only thing that would change this is if Prometheus introduced support for delta aggregations.
yea.. Views will likely make this more complex :)
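Putting the snippet above into a fuller, hypothetical example (the meter name matches the AddSource call; the counter name, tag, port, and the Build() call are illustrative):

using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using OpenTelemetry;
using OpenTelemetry.Metrics;

public static class Program
{
    public static void Main()
    {
        int port = 9184; // illustrative port

        using var meter = new Meter("TestMeter");
        var counter = meter.CreateCounter<long>("myCounter");

        using var meterProvider = Sdk.CreateMeterProviderBuilder()
            .AddSource("TestMeter")
            .AddPrometheusExporter(opt => opt.Url = $"http://localhost:{port}/metrics/")
            .Build();

        counter.Add(10, new KeyValuePair<string, object>("tag1", "value1"));

        // Block so the process stays alive and Prometheus can scrape
        // http://localhost:<port>/metrics/.
        System.Console.ReadLine();
    }
}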
{
    public const string ContentType = "text/plain; version = 0.0.4";

    private static readonly char[] FirstCharacterNameCharset =
Consider something like FirstCharacterNameCharset = "abcde".ToCharArray().
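For illustration, the suggested pattern might look like the following; the exact character set here is an assumption, taken from Prometheus' metric-name rules ([a-zA-Z_:] for the first character):

// Build the allowed-first-character set from a string instead of a long
// explicit char[] initializer (character set assumed, for illustration only).
private static readonly char[] FirstCharacterNameCharset =
    "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ_:".ToCharArray();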
// Label names may contain ASCII letters, numbers, as well as underscores. They must match the regex [a-zA-Z_][a-zA-Z0-9_]*. Label names beginning with __ are reserved for internal use.
// Label values may contain any Unicode characters.

var sb = new StringBuilder();
- Creating a StringBuilder for each call could be very expensive; at a minimum we need to cache it.
- Instead of doing the conversion each time, it might make sense to cache some frequently used values (e.g., consider a data structure such as a splay tree).
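One possible shape for the caching idea, sketched under assumptions (the class and method names are illustrative, and a ConcurrentDictionary stands in for the splay-tree suggestion):

using System.Collections.Concurrent;
using System.Text;

internal static class PrometheusLabelNameCache
{
    // Memoize sanitized label names so the per-call StringBuilder and the
    // char-by-char conversion are paid only once per distinct input.
    private static readonly ConcurrentDictionary<string, string> Cache =
        new ConcurrentDictionary<string, string>();

    public static string GetSafeLabelName(string name) =>
        Cache.GetOrAdd(name, key =>
        {
            // Enforce [a-zA-Z_][a-zA-Z0-9_]* by replacing invalid characters with '_'.
            var sb = new StringBuilder(key.Length);
            for (int i = 0; i < key.Length; i++)
            {
                char c = key[i];
                bool valid = c == '_'
                    || (c >= 'a' && c <= 'z')
                    || (c >= 'A' && c <= 'Z')
                    || (i > 0 && c >= '0' && c <= '9');
                sb.Append(valid ? c : '_');
            }

            return sb.ToString();
        });
}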
// (\), double-quote ("), and line feed (\n) characters have to be escaped
// as \\, \", and \n, respectively.

var result = value.Replace("\\", "\\\\");
This would scan the string three times, with extra allocations. There might be a more optimal way.
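One way to avoid the triple scan is a single pass that only allocates when an escape is actually needed. This is just a sketch; the class and method names are not from the PR:

using System.Text;

internal static class PrometheusEscaping
{
    // Single pass: backslash, double-quote, and line feed become \\, \", and \n;
    // everything else is copied through. No allocation if nothing needs escaping.
    public static string EscapeLabelValue(string value)
    {
        StringBuilder sb = null;
        for (int i = 0; i < value.Length; i++)
        {
            char c = value[i];
            if (c == '\\' || c == '"' || c == '\n')
            {
                // Lazily copy the already-scanned prefix on the first escape.
                sb ??= new StringBuilder(value.Length + 8).Append(value, 0, i);
                sb.Append('\\').Append(c == '\n' ? 'n' : c);
            }
            else
            {
                sb?.Append(c);
            }
        }

        return sb?.ToString() ?? value;
    }
}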
LGTM. There are a couple of places where we might need to tune perf; not a blocker for now.
Thanks. The Prometheus part was just copied from the old exporter to do some initial testing. Prometheus will be reworked later to address perf (and other issues).
Codecov Report
@@ Coverage Diff @@
## metrics #2143 +/- ##
========================================
Coverage 78.24% 78.24%
========================================
Files 212 212
Lines 6634 6634
========================================
Hits 5191 5191
Misses 1443 1443
</ItemGroup>

<ItemGroup Condition="'$(TargetFramework)' == 'netstandard2.0'">
  <PackageReference Include="Microsoft.AspNetCore.Http.Abstractions" Version="$(MicrosoftAspNetCoreHttpAbstractionsPkgVer)" />
At some point it might make sense to pull the middleware into a separate extension package specifically for ASP.NET Core apps.
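A rough sketch of what such an ASP.NET Core extension could look like; the middleware and extension-method names, and the way the exporter writes to the response, are assumptions rather than this PR's API:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

// PrometheusExporter comes from this package; hypothetical middleware below.
public sealed class PrometheusScrapingMiddleware
{
    private readonly RequestDelegate next; // kept for the conventional signature; this middleware is terminal
    private readonly PrometheusExporter exporter;

    public PrometheusScrapingMiddleware(RequestDelegate next, PrometheusExporter exporter)
    {
        this.next = next;
        this.exporter = exporter;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        context.Response.ContentType = "text/plain; version = 0.0.4";

        // Same flow as the HttpListener server: pull, then write the stored metrics.
        this.exporter.MakePullRequest();

        using var writer = new StreamWriter(context.Response.Body);
        this.exporter.WriteMetricsCollection(writer); // signature assumed
        await writer.FlushAsync().ConfigureAwait(false);
    }
}

public static class PrometheusApplicationBuilderExtensions
{
    public static IApplicationBuilder UsePrometheusScraping(this IApplicationBuilder app)
        => app.UseMiddleware<PrometheusScrapingMiddleware>();
}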
LGTM
Just a basic one, showing Sum metrics from a sync counter :)
Adding it now, as it kinda shows a true "Pull exporter", requiring Cumulative temporality.
The "in-memory aggregation state" is collected only upon a scrape request. (Once we add support for async instruments, they too will be collected only upon a scrape request.)
Note: Most of the Prometheus code was resurrected from the ~2-year-old exporter. Consider this an initial version (and be gentle with PR comments on the Prometheus part :D), just to keep things moving :)
Sample output from scraping: