Add open telemetry to Azure.AI.Inferencing #45751
base: main
Conversation
API change check: APIView has identified API-level changes in this PR and created the following API reviews.
```csharp
const string ACTIVITY = "Azure.AI.Inference.ChatCompletionsClient";
```

To log metrics and events to Application Insights, we need the connection string. In the Azure portal, create the Application Insights resource you wish to use for storing the telemetry, then open its overview page and find the "Connection String". It generally has a format similar to "InstrumentationKey=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx;IngestionEndpoint=https://region-x.in.applicationinsights.azure.com/;LiveEndpoint=https://eastus.livediagnostics.monitor.azure.com/;ApplicationId=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx". In our code we will create a constant storing the connection string.
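The wiring described above could be sketched as follows. This is a non-authoritative sketch, not the sample's actual code: it assumes the `OpenTelemetry` and `Azure.Monitor.OpenTelemetry.Exporter` packages are referenced, and the `ConnectionString` constant is a placeholder for your own Application Insights connection string.

```csharp
using OpenTelemetry;
using OpenTelemetry.Trace;
using Azure.Monitor.OpenTelemetry.Exporter;

// Placeholder; substitute the value copied from the Azure portal.
const string ConnectionString = "<your Application Insights connection string>";

// Subscribe to the client's ActivitySource and export spans to Application Insights.
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("Azure.AI.Inference.ChatCompletionsClient")
    .AddAzureMonitorTraceExporter(o => o.ConnectionString = ConnectionString)
    .Build();
```

Disposing the provider at shutdown flushes any spans that have not yet been exported.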
@scottaddie: Should we be adding new samples that are based on connection strings, or should we be demonstrating identity credentials for all the things?
```diff
@@ -391,6 +391,13 @@
 <PackageReference Update="Microsoft.Extensions.Logging.Configuration" Version="2.1.1" />
 </ItemGroup>

 <ItemGroup Condition="$(MSBuildProjectName.StartsWith('Azure.AI.Inference.Tests'))">
```
There's no justification for isolating this to just the AI.Inference library. We'll either want to approve as a general dependency for tests/samples or not.
@KrzysztofCwalina, @tg-msft : I'd appreciate your thoughts.
@lmolkova: If you would be so kind as to take a look at the OTel bits, it would be appreciated.
Does the NuGet now declare a new dependency on one or more tracing packages, even if I don't want to turn on tracing? If so, what is the total size of these additional packages?
OpenTelemetry is not a dependency of Azure.AI.Inference. In this PR we create an Activity, and if the user has created a listener from scratch (as we did in the E2E tests) or has set up the tracer and meter providers, the client will record the metrics and send them to the listener. This part is done by the standard .NET diagnostics library.
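The "listener from scratch" path can be sketched with only `System.Diagnostics` from the base class library, with no OpenTelemetry package involved. This is an illustrative sketch, not the E2E tests' actual listener; the source name matches the `ACTIVITY` constant shown earlier.

```csharp
using System.Diagnostics;

// Listen only to the inference client's ActivitySource.
var listener = new ActivityListener
{
    ShouldListenTo = source =>
        source.Name == "Azure.AI.Inference.ChatCompletionsClient",
    // Record all data for matching activities so tags and events are captured.
    Sample = (ref ActivityCreationOptions<ActivityContext> _) =>
        ActivitySamplingResult.AllDataAndRecorded,
    // Inspect each completed activity, e.g. its name and duration.
    ActivityStopped = activity =>
        Console.WriteLine($"{activity.DisplayName}: {activity.Duration}")
};
ActivitySource.AddActivityListener(listener);
```

Because the client only creates an `Activity` when some listener is subscribed, an application that never registers a listener or provider pays essentially no tracing cost.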
```csharp
    return "";
}

public static bool GetSwithVariableVal(string name)
```
spelling: "Swith" in `GetSwithVariableVal` looks like a typo.
In this PR we are adding Open Telemetry support to Azure.AI.Inference.