
Track the conversion of the benchmark to a real benchmark in skywalking exporter. #6116

Closed
liqiangz opened this issue Nov 3, 2021 · 17 comments

liqiangz (Contributor) commented Nov 3, 2021

Refer to #5690. This issue tracks the benchmark work for the skywalking exporter.

liqiangz (Contributor, Author) commented Nov 3, 2021

@jpkrohling the issue has been created.

jpkrohling added the ci-cd, comp: exporter, and good first issue labels Nov 3, 2021
vasireddy99 (Contributor) commented Nov 4, 2021

@liqiangz Can you please provide more input on the steps involved? I would like to give it a try :)

jpkrohling (Member) commented:

Sure! This file contains a test that measures the time taken by some operations:

exporter/skywalkingexporter/skywalking_benchmark_test.go

This thread is relevant to this task: https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/5690/files#r739082935

Ideally, we would be using real Go benchmarks, as described here: https://pkg.go.dev/testing#hdr-Benchmarks

If benchmarking that isn't straightforward (I think it should be), that source file should just be removed. The goal of that test is to verify that concurrent calls to pushLogs benefit from the implementation's connection pooling mechanism, which looks like this:

tClient, ok := <-oce.logsClients
if !ok {
    return errors.New("failed to push logs, Skywalking exporter was already stopped")
}
if tClient == nil {
    var err error
    tClient, err = oce.createLogServiceRPC()
    if err != nil {
        // Cannot create an RPC, put back nil to keep the number of streams constant.
        oce.logsClients <- nil
        return err
    }
}
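
For reference, the conversion might look roughly like the sketch below, which drives pushLogs from concurrent goroutines through testing.B.RunParallel so the benefit of the pool can actually show up in the numbers. The newTestLogsExporter and generateTestLogs helpers are hypothetical stand-ins for the exporter/mock-backend setup and the test data that skywalking_benchmark_test.go already builds.

package skywalkingexporter

import (
    "context"
    "testing"
)

// BenchmarkPushLogs is a sketch of a real Go benchmark for the exporter.
// It measures pushLogs under parallel load, which is the path the
// logsClients pool above is meant to speed up.
func BenchmarkPushLogs(b *testing.B) {
    oce := newTestLogsExporter(b) // hypothetical helper: exporter wired to a mock gRPC backend
    logs := generateTestLogs()    // hypothetical helper: a log payload to export

    b.ResetTimer()
    b.RunParallel(func(pb *testing.PB) {
        for pb.Next() {
            if err := oce.pushLogs(context.Background(), logs); err != nil {
                b.Error(err) // safe to call from parallel benchmark goroutines
            }
        }
    })
}

Comparing runs of something like go test -bench=BenchmarkPushLogs -benchtime=10s with different pool sizes should then show whether the pooling pays off; if the numbers don't move, that would support just removing the old timing test.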

vasireddy99 (Contributor) commented:

Hi @alolita, please assign this to me. I would like to start working on this.

jpkrohling (Member) commented:

Assigned.

github-actions bot commented Nov 7, 2022

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

github-actions bot commented:

Pinging code owners: @liqiangz. See Adding Labels via Comments if you do not have permissions to add labels yourself.

povilasv referenced this issue in coralogix/opentelemetry-collector-contrib Dec 19, 2022
github-actions bot commented:

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label Jan 30, 2023
jpkrohling (Member) commented:

@liqiangz, are you still the code owner for the component?

github-actions bot removed the Stale label Apr 6, 2023
github-actions bot commented Jun 6, 2023

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

liqiangz (Contributor, Author) commented:

> @liqiangz, are you still the code owner for the component?

Please assign it to me.

github-actions bot removed the Stale label Jun 28, 2023
jpkrohling assigned liqiangz and unassigned vasireddy99 Jul 10, 2023
github-actions bot commented:

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

  • exporter/skywalking: @liqiangz

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label Sep 11, 2023
github-actions bot commented:

This issue has been closed as inactive because it has been stale for 120 days with no activity.

github-actions bot closed this as not planned Nov 10, 2023
jpkrohling reopened this Nov 23, 2023
jpkrohling (Member) commented:

@wu-sheng, any chance we can get this done?

wu-sheng commented:

@liqiangz Are you still on this?

@jpkrohling I have no update on this either.

github-actions bot commented:

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

github-actions bot added the Stale label Jan 24, 2024
jpkrohling (Member) commented:

Given that the component will be removed soon, this should be closed.
