Disabling ASM Throughput Job #5345

Merged · 9 commits · Mar 28, 2024
59 changes: 2 additions & 57 deletions .azure-pipelines/ultimate-pipeline.yml
@@ -3885,9 +3885,9 @@ stages:
jobs:
- template: steps/update-github-status-jobs.yml
parameters:
- jobs: [Linux64, Windows64, LinuxArm64, AsmLinux64]
+ jobs: [Linux64, Windows64, LinuxArm64]
allowSkipped: true
- #### Throughput Linux 64, windows 64, linux arm 64
+ #### Throughput Linux 64, Windows 64, Linux arm 64

- job: Linux64
timeoutInMinutes: 60
@@ -4022,54 +4022,6 @@ stages:
# bother trying to upload these in case of failure, which means we can retry the
# stages without issue
artifact: crank_linux_arm64_1

#### Throughput-AppSec Linux 64

- job: AsmLinux64
timeoutInMinutes: 60
pool: Throughput-AppSec
condition: >
or(
eq(variables.isMainBranch, true),
eq(variables.force_appsec_throughput_run, 'true'),
eq(variables.isAppSecChanged, 'True')
)
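The `condition` above gates the (now removed) AsmLinux64 job: it runs only on the main branch, when a run is forced via `force_appsec_throughput_run`, or when AppSec files changed. The mixed `'true'`/`'True'` literals both work because Azure Pipelines' `eq()` compares strings case-insensitively. A hedged shell sketch of the same gate (the function name is illustrative, not from the pipeline; values are assumed to arrive as strings):

```shell
#!/bin/sh
# Sketch of the Azure `condition:` logic, not the pipeline's own code.
# Azure's eq() is case-insensitive for strings, so we normalize here too.
should_run_asm_throughput() {
  is_main="$1"; force_run="$2"; appsec_changed="$3"
  is_main=$(printf '%s' "$is_main" | tr '[:upper:]' '[:lower:]')
  force_run=$(printf '%s' "$force_run" | tr '[:upper:]' '[:lower:]')
  appsec_changed=$(printf '%s' "$appsec_changed" | tr '[:upper:]' '[:lower:]')
  # or(): any one condition being true is enough to schedule the job.
  [ "$is_main" = "true" ] || [ "$force_run" = "true" ] || [ "$appsec_changed" = "true" ]
}

should_run_asm_throughput "false" "false" "True" && echo "run" || echo "skip"   # prints "run"
```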

steps:
- template: steps/clone-repo.yml
parameters:
targetShaId: $(targetShaId)
targetBranch: $(targetBranch)
- task: DownloadPipelineArtifact@2
displayName: Download linux native binary
inputs:
artifact: linux-monitoring-home-linux-x64
path: $(System.DefaultWorkingDirectory)/tracer/tracer-home-linux

- script: |
test ! -s "tracer/tracer-home-linux/linux-x64/Datadog.Trace.ClrProfiler.Native.so" && echo "tracer/tracer-home-linux/linux-x64/Datadog.Trace.ClrProfiler.Native.so (native loader) does not exist" && exit 1
test ! -s "tracer/tracer-home-linux/linux-x64/Datadog.Tracer.Native.so" && echo "tracer/tracer-home-linux/linux-x64/Datadog.Tracer.Native.so does not exist" && exit 1
test ! -s "tracer/tracer-home-linux/linux-x64/libddwaf.so" && echo "tracer/tracer-home-linux/linux-x64/libddwaf.so does not exist" && exit 1
mkdir -p $(CrankDir)/results/logs
cd $(CrankDir)
chmod +x ./run-appsec.sh
./run-appsec.sh "linux"
displayName: Crank
env:
DD_SERVICE: dd-trace-dotnet
DD_ENV: CI
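Each `test ! -s FILE` line in the Crank step above succeeds when FILE is missing or empty, so the step aborts before running benchmarks against an incomplete build. A standalone sketch of that guard pattern (`require_nonempty` is a hypothetical helper name, not part of the pipeline):

```shell
#!/bin/sh
# Sketch of the pre-flight check: `test -s FILE` is true only when FILE
# exists and has a size greater than zero, so the negated form rejects
# missing or empty binaries before any benchmark time is spent.
require_nonempty() {
  if [ ! -s "$1" ]; then
    echo "$1 does not exist or is empty" >&2
    return 1
  fi
}

# Demo on a throwaway file rather than the real tracer binaries:
tmp=$(mktemp)
require_nonempty "$tmp" || echo "empty file rejected"
printf 'x' > "$tmp"
require_nonempty "$tmp" && echo "non-empty file accepted"
rm -f "$tmp"
```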

- script: |
cp $(CrankDir)/*.json $(CrankDir)/results
displayName: Copy the results to results dir

- publish: "$(CrankDir)/results"
displayName: Publish results
# We don't include the JobAttempt in this case, because we rely on a specific name
# and an error in the throughput tests probably means no usable data, so don't
# bother trying to upload these in case of failure, which means we can retry the
# stages without issue
artifact: crank_linux_x64_asm_1

- stage: throughput_profiler
condition: >
@@ -5956,13 +5908,6 @@ stages:
artifact: crank_linux_x64_1
path: $(System.DefaultWorkingDirectory)/tracer/build_data/throughput/current/crank_linux_x64_1

- task: DownloadPipelineArtifact@2
displayName: Download crank_linux_x64_asm_1
continueOnError: true
inputs:
artifact: crank_linux_x64_asm_1
path: $(System.DefaultWorkingDirectory)/tracer/build_data/throughput/current/crank_linux_x64_asm_1

- task: DownloadPipelineArtifact@2
displayName: Download crank_windows_x64_1
continueOnError: true