exporting logs is not working #1078
Comments
Hi @fhuthmacher, were you able to figure out a solution for this? I am facing the same issue. In my case, I am trying to use …
Are logs supported in the collector at all?
Hi @fhuthmacher, auto-instrumentation of logs is not enabled by default, as far as I know. So I think you should enable it explicitly by setting the corresponding environment variable.
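The exact setting was lost from the comment above. For what it's worth, recent opentelemetry-python auto-instrumentation gates log export behind an environment variable along these lines (verify the exact name against the version bundled in your layer):

```shell
# Opt in to log auto-instrumentation (name may vary by SDK version).
export OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
```

In a Lambda, this would be set as a function environment variable rather than exported in a shell.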
Hi everyone, I have the same problem with the ADOT layer aws-otel-python-amd64-ver-1-25-0:1. Thanks for your help.
I am trying to get traces, metrics, and logs working via the AWS Distro for OpenTelemetry Collector (ADOT Collector), using the following Lambda layer: arn:aws:lambda:us-east-1:901920570463:layer:aws-otel-python-amd64-ver-1-20-0:3 and Python 3.8.
My understanding is that this version includes OpenTelemetry Python v1.20.0 and ADOT Collector v0.35.0, and should support traces and metrics, with logs as an experimental signal.
The issue is that no matter how I configure otlphttp, the collector does not export logs at all.
Steps to reproduce:
Create a collector.yml file with the content below in the root directory:
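The original collector.yml content was not captured on this page. A minimal sketch of a configuration that wires a logs pipeline through the same logging exporter used for traces and metrics (exporter names and options may differ between collector versions):

```yaml
receivers:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  logging:
    loglevel: debug

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [logging]
    metrics:
      receivers: [otlp]
      exporters: [logging]
    logs:
      receivers: [otlp]
      exporters: [logging]
```

Note that without an explicit `logs` pipeline in `service.pipelines`, the collector drops log records even if the SDK emits them.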
Set two environment variables:
AWS_LAMBDA_EXEC_WRAPPER = /opt/otel-instrument
OPENTELEMETRY_COLLECTOR_CONFIG_FILE = /var/task/collector.yml
Update lambda_function.py with the code below:
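The original Lambda code was not captured on this page either. A minimal sketch of a handler that emits a record through Python's standard logging module, assuming the layer's auto-instrumentation forwards stdlib log records to the collector (the handler body and return value are hypothetical):

```python
import logging

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    # With log auto-instrumentation enabled, this record should be
    # exported by the collector alongside the traces and metrics.
    logger.info("hello from lambda_handler")
    return {"statusCode": 200}
```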
What is the expected behavior?
I would expect to see logging output for logs similar to what I get for traces and metrics.
What is the actual behavior?
I don't see any log output, only traces and metrics; see the example below.
Function Logs
0\nScopeSpans SchemaURL: \nInstrumentationScope opentelemetry.instrumentation.botocore 0.41b0\nSpan #0\n Trace ID : b348c171927592ab5c0d60183c9352f4\n Parent ID : 6cdd27675458cb3f\n ID : f7ce677c8611cddd\n Name : S3.ListBuckets\n Kind : Client\n Start time : 2023-12-28 16:59:09.835148036 +0000 UTC\n End time : 2023-12-28 16:59:10.177052834 +0000 UTC\n Status code : Unset\n Status message : \nAttributes:\n -> rpc.system: Str(aws-api)\n -> rpc.service: Str(S3)\n -> rpc.method: Str(ListBuckets)\n -> aws.region: Str(us-east-1)\n -> aws.request_id: Str(VP5ZCEZERDWAKQ33)\n -> retry_attempts: Int(0)\n -> http.status_code: Int(200)\nSpan #1\n Trace ID : b348c171927592ab5c0d60183c9352f4\n Parent ID : 6cdd27675458cb3f\n ID : c9f69080451e23c4\n Name : EC2.DescribeInstances\n Kind : Client\n Start time : 2023-12-28 16:59:11.253592039 +0000 UTC\n End time : 2023-12-28 16:59:11.735706355 +0000 UTC\n Status code : Unset\n Status message : \nAttributes:\n -> rpc.system: Str(aws-api)\n -> rpc.service: Str(EC2)\n -> rpc.method: Str(DescribeInstances)\n -> aws.region: Str(us-east-1)\n -> aws.request_id: Str(8f664204-f9dd-4e05-9079-1a76caf78c84)\n -> retry_attempts: Int(0)\n -> http.status_code: Int(200)\nScopeSpans #1\nScopeSpans SchemaURL: \nInstrumentationScope appl.tracer \nSpan #0\n Trace ID : b348c171927592ab5c0d60183c9352f4\n Parent ID : 4c77eecbc20517cd\n ID : 6cdd27675458cb3f\n Name : do_work\n Kind : Internal\n Start time : 2023-12-28 16:59:08.037059752 +0000 UTC\n End time : 2023-12-28 16:59:11.793806031 +0000 UTC\n Status code : Unset\n Status message : \nScopeSpans #2\nScopeSpans SchemaURL: \nInstrumentationScope opentelemetry.instrumentation.aws_lambda 0.41b0\nSpan #0\n Trace ID : b348c171927592ab5c0d60183c9352f4\n Parent ID : \n ID : 4c77eecbc20517cd\n Name : 
lambda_function.lambda_handler\n Kind : Server\n Start time : 2023-12-28 16:59:07.968461942 +0000 UTC\n End time : 2023-12-28 16:59:11.793881653 +0000 UTC\n Status code : Unset\n Status message : \nAttributes:\n -> faas.id: Str(arn:aws:lambda:us-east-1:026459568683:function:AOS_Trace_Demo38)\n -> faas.execution: Str(0058b917-9560-4762-9e51-550b33479b0f)\n","kind":"exporter","data_type":"traces","name":"logging"}
{"level":"info","ts":1703782753.9983406,"msg":"MetricsExporter","kind":"exporter","data_type":"metrics","name":"logging","resource metrics":1,"metrics":1,"data points":1}
{"level":"info","ts":1703782753.9984167,"msg":"ResourceMetrics #0\nResource SchemaURL: \nResource attributes:\n -> telemetry.sdk.language: Str(python)\n -> telemetry.sdk.name: Str(opentelemetry)\n -> telemetry.sdk.version: Str(1.20.0)\n -> cloud.region: Str(us-east-1)\n -> cloud.provider: Str(aws)\n -> faas.name: Str(AOS_Trace_Demo38)\n -> faas.version: Str($LATEST)\n -> faas.instance: Str(2023/12/28/[$LATEST]c2c54d603ad44f4d8ee20f683340f655)\n -> service.name: Str(demo-lambda)\n -> telemetry.auto.version: Str(0.41b0)\nScopeMetrics #0\nScopeMetrics SchemaURL: \nInstrumentationScope appl.meter \nMetric #0\nDescriptor:\n -> Name: dir.calls\n -> Description: The number of directory calls\n -> Unit: \n -> DataType: Sum\n -> IsMonotonic: true\n -> AggregationTemporality: Cumulative\nNumberDataPoints #0\nData point attributes:\n -> dir.value: Str(0)\nStartTimestamp: 2023-12-28 16:59:08.093701828 +0000 UTC\nTimestamp: 2023-12-28 16:59:13.791589871 +0000 UTC\nValue: 1\n","kind":"exporter","data_type":"metrics","name":"logging"}
END RequestId: 0058b917-9560-4762-9e51-550b33479b0f
REPORT RequestId: 0058b917-9560-4762-9e51-550b33479b0f Duration: 6073.88 ms Billed Duration: 6074 ms Memory Size: 128 MB Max Memory Used: 128 MB Init Duration: 1847.31 ms