
AzureML Tests #1717

Merged: 122 commits, merged on May 31, 2022

Commits
5331bbe
trying to build aml env using wheel file
pradnyeshjoshi Feb 2, 2022
d407ca5
set up testing env for azureml tests
pradnyeshjoshi Feb 4, 2022
aa380bc
added flags to include gpu/spark dependencies in aml env
pradnyeshjoshi Feb 7, 2022
f1b06e2
changed folder structure for aml tests
pradnyeshjoshi Feb 7, 2022
514be55
trying to trigger aml test with github actions
pradnyeshjoshi Feb 7, 2022
d1843ca
added workflow to trigger cpu unit tests on aml
pradnyeshjoshi Feb 8, 2022
e19c1d0
update run_pytest path
pradnyeshjoshi Feb 8, 2022
883c174
print aml test logs in github actions
pradnyeshjoshi Feb 10, 2022
6ccbff5
print aml test logs in github actions
pradnyeshjoshi Feb 10, 2022
7f59e27
update aml test logs path in github actions
pradnyeshjoshi Feb 10, 2022
b1b01f2
parametrize test logs path
pradnyeshjoshi Feb 10, 2022
0f75481
parametrize test logs path
pradnyeshjoshi Feb 10, 2022
63d9b95
parametrize test logs path
pradnyeshjoshi Feb 10, 2022
6db3555
print pytest exit code
pradnyeshjoshi Feb 10, 2022
586a211
remove unused import
pradnyeshjoshi Feb 10, 2022
430d718
convert pytest exit code to str
pradnyeshjoshi Feb 10, 2022
ad6f26d
add pytest success/fail status
pradnyeshjoshi Feb 10, 2022
a280c39
add pytest success/fail status
pradnyeshjoshi Feb 10, 2022
e6b4a66
add pytest success/fail status
pradnyeshjoshi Feb 10, 2022
d3f828d
add pytest success/fail status
pradnyeshjoshi Feb 10, 2022
0215272
check test failure
pradnyeshjoshi Feb 10, 2022
ce19b65
check test failure
pradnyeshjoshi Feb 10, 2022
e72f129
cleanup aml-test.yml
pradnyeshjoshi Feb 10, 2022
cac35bb
cleanup aml-test.yml
pradnyeshjoshi Feb 10, 2022
75558d2
cleanup aml-test.yml
pradnyeshjoshi Feb 10, 2022
bb856e8
cleanup aml-test.yml
pradnyeshjoshi Feb 10, 2022
b1ac4cc
fail test
pradnyeshjoshi Feb 10, 2022
dd34029
success test
pradnyeshjoshi Feb 10, 2022
3222f5e
add extra test summary
pradnyeshjoshi Feb 10, 2022
d14a5f7
add extra test summary
pradnyeshjoshi Feb 10, 2022
50073ca
add extra test summary
pradnyeshjoshi Feb 10, 2022
947152e
cleanup pytest logs
pradnyeshjoshi Feb 10, 2022
17ef0b3
added workflow to run gpu unit tests
pradnyeshjoshi Feb 11, 2022
82b6723
added workflow to run gpu unit tests
pradnyeshjoshi Feb 11, 2022
e7c4df3
added experimental deps in conda env
pradnyeshjoshi Feb 11, 2022
f34a1d1
add cmake to aml test env
pradnyeshjoshi Feb 11, 2022
94c2513
add aml spark unit test workflow
pradnyeshjoshi Feb 11, 2022
e185166
add java setup for spark tests
pradnyeshjoshi Feb 11, 2022
2b7014e
added notebooks test workflows
pradnyeshjoshi Feb 11, 2022
f7b1f2b
set pyspark env variables
pradnyeshjoshi Feb 16, 2022
a6f4cd7
unset spark home
pradnyeshjoshi Feb 16, 2022
075aee8
downgrade synapseml version
pradnyeshjoshi Feb 17, 2022
3eda58d
added aml cpu smoke nightly
pradnyeshjoshi Feb 17, 2022
8a56cee
change aml cpu smoke nightly build time
pradnyeshjoshi Feb 18, 2022
83a0de0
change aml cpu smoke nightly build time
pradnyeshjoshi Feb 18, 2022
c23526b
move cpu nightly aml workflow
pradnyeshjoshi Feb 18, 2022
3eb08cc
resolve aml cpu nightly yml error
pradnyeshjoshi Feb 18, 2022
955def5
resolve aml cpu nightly yml error
pradnyeshjoshi Feb 18, 2022
8f1cb8c
resolve aml cpu nightly yml error
pradnyeshjoshi Feb 18, 2022
85caa47
resolve aml cpu nightly yml error
pradnyeshjoshi Feb 18, 2022
7db081d
resolve aml cpu nightly yml error
pradnyeshjoshi Feb 18, 2022
03679d3
add integartion tests to aml cpu nightly
pradnyeshjoshi Feb 18, 2022
568388a
add test-kind param to aml cpu nightly
pradnyeshjoshi Feb 18, 2022
aa9a944
resolve aml cpu nightly error
pradnyeshjoshi Feb 18, 2022
99971b9
trigger cpu nightly
pradnyeshjoshi Feb 18, 2022
8ef9ec6
trigger all nightly builds on aml
pradnyeshjoshi Feb 18, 2022
7c08b3a
add nightly builds for multiple python versions
pradnyeshjoshi Feb 25, 2022
0a41a39
resolve python version arg error
pradnyeshjoshi Feb 25, 2022
6d2e0d3
added groupwise aml tests
pradnyeshjoshi Mar 27, 2022
c2e70a5
tryaml tests in groups
pradnyeshjoshi Apr 1, 2022
179b54c
trremove report folder upload
pradnyeshjoshi Apr 1, 2022
3654064
rename nightly aml exp
pradnyeshjoshi Apr 1, 2022
d6e292a
write logs to file instead of stdout
pradnyeshjoshi Apr 1, 2022
166bca3
pyest logs to file
pradnyeshjoshi Apr 2, 2022
b1cfc00
change pytest log level to info
pradnyeshjoshi Apr 2, 2022
f58b03d
enable pytest log_cli
pradnyeshjoshi Apr 2, 2022
c3ca5f8
change pytest log level to debug
pradnyeshjoshi Apr 2, 2022
326d1d2
try different pytest loggng options
pradnyeshjoshi Apr 2, 2022
88ccb89
try different pytest loggng options
pradnyeshjoshi Apr 2, 2022
13b655a
try different pytest loggng options
pradnyeshjoshi Apr 2, 2022
3c84d3a
trigger all smoke/integration tests
pradnyeshjoshi Apr 2, 2022
129ed24
correct artifact path
pradnyeshjoshi Apr 2, 2022
a5dd281
correct test paths
pradnyeshjoshi Apr 2, 2022
2db6739
fix gpu/spark dependencies issue in aml tests
pradnyeshjoshi Apr 6, 2022
7c8870f
groups
miguelgfierro Apr 6, 2022
072b0c5
groups
miguelgfierro Apr 6, 2022
509efbd
groups
miguelgfierro Apr 6, 2022
42e1328
groups
miguelgfierro Apr 6, 2022
effde2c
groups
miguelgfierro Apr 6, 2022
fa66ec8
groups
miguelgfierro Apr 6, 2022
237ca39
buckets
miguelgfierro Apr 6, 2022
debd98c
Merge pull request #1696 from microsoft/miguel/aml_tests
pradnyeshjoshi Apr 7, 2022
6427b40
trigger cpu smoke/integration tests
pradnyeshjoshi Apr 7, 2022
a865ff2
trigger all smoke/integration tests
pradnyeshjoshi Apr 7, 2022
c98275f
trigger conditional gpu tests on aml
pradnyeshjoshi Apr 9, 2022
c31df84
trigger all smoke/integration tests for python3.6
pradnyeshjoshi Apr 9, 2022
44c8ecb
trigger all smoke/integration tests for python3.6
pradnyeshjoshi Apr 9, 2022
f2d88e1
trigger unit tests on aml
pradnyeshjoshi Apr 10, 2022
2f412c9
tadd testkind arg
pradnyeshjoshi Apr 10, 2022
d9566df
correct group name
pradnyeshjoshi Apr 10, 2022
f85987c
rename aml unittest yml
pradnyeshjoshi Apr 10, 2022
20ebc4d
group aml unit tests
pradnyeshjoshi Apr 10, 2022
482ffc7
group aml unit tests
pradnyeshjoshi Apr 10, 2022
87dc10e
group aml unit tests
pradnyeshjoshi Apr 10, 2022
c80f936
group aml unit tests
pradnyeshjoshi Apr 10, 2022
c4465f4
regroup aml unit tests
pradnyeshjoshi Apr 10, 2022
10b1fb4
resolve syntax err
pradnyeshjoshi Apr 10, 2022
130b723
group aml unit tests
pradnyeshjoshi Apr 10, 2022
3cf3dd3
rresolve syntax err
pradnyeshjoshi Apr 10, 2022
03f0aad
trigger nightly aml tests
pradnyeshjoshi Apr 10, 2022
f7cac77
add logs to sasrec notebook
pradnyeshjoshi May 12, 2022
a374830
finalize github workflows
pradnyeshjoshi May 12, 2022
8e2234f
merge with latest staging
pradnyeshjoshi May 12, 2022
859ffa9
remove output nb upload
pradnyeshjoshi May 13, 2022
6b523aa
remove time.time logs from sasrec nb
pradnyeshjoshi May 13, 2022
21193ed
change aml to azureml
pradnyeshjoshi May 20, 2022
29e3d59
change aml to azureml
pradnyeshjoshi May 20, 2022
5bf8fe6
remove azure pipelines azureml tests
pradnyeshjoshi May 20, 2022
802eb52
use correct wheel path
pradnyeshjoshi May 20, 2022
0e4b026
test: trigger unit tests only once
pradnyeshjoshi May 20, 2022
c8feebc
corrected pytest path
pradnyeshjoshi May 20, 2022
bec97a4
parametrize branch name
pradnyeshjoshi May 20, 2022
f9b617c
added total group time for unit tests; test trigger nightly
pradnyeshjoshi May 20, 2022
74b0fa0
remove test triggers of github workflow for my branch
pradnyeshjoshi May 23, 2022
55b04bb
Merge branch 'staging' into pradjoshi/aml_tests
pradnyeshjoshi May 23, 2022
278c867
Merge branch 'staging' into pradjoshi/aml_tests
pradnyeshjoshi May 23, 2022
6f02fd7
Merge branch 'pradjoshi/aml_tests' of github.com:microsoft/recommende…
pradnyeshjoshi May 23, 2022
7001bf6
Merge branch 'pradjoshi/aml_tests' of github.com:microsoft/recommende…
pradnyeshjoshi May 23, 2022
0c188fd
Merge branch 'pradjoshi/aml_tests' of github.com:microsoft/recommende…
pradnyeshjoshi May 23, 2022
80f70e6
Merge branch 'staging' into pradjoshi/aml_tests
pradnyeshjoshi May 24, 2022
963f384
Merge branch 'staging' into pradjoshi/aml_tests
pradnyeshjoshi May 26, 2022
1b711cf
Merge branch 'staging' into pradjoshi/aml_tests
pradnyeshjoshi May 30, 2022
103 changes: 103 additions & 0 deletions .github/workflows/azureml-nightly.yml
@@ -0,0 +1,103 @@
# ---------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
# ---------------------------------------------------------

name: azureml-nightly

on:
  # ┌───────────── minute (0 - 59)
  # │ ┌───────────── hour (0 - 23)
  # │ │ ┌───────────── day of the month (1 - 31)
  # │ │ │ ┌───────────── month (1 - 12 or JAN-DEC)
  # │ │ │ │ ┌───────────── day of the week (0 - 6 or SUN-SAT)
  # │ │ │ │ │
  # │ │ │ │ │
  schedule:
    - cron: '0 0 * * *'  # runs every day at 12 AM
    # cron works with the default branch (main) only:
    # https://github.community/t/on-schedule-per-branch/17525/2

  push:
    # Because runs cannot be scheduled for non-main branches, a push
    # trigger is added so the build also runs on the staging branch.
    branches: [staging, pradjoshi/aml_tests]
    # branches: [staging]

  # enable manual trigger
  workflow_dispatch:
    inputs:
      tags:
        description: 'Tags to label this manual run (optional)'
        default: 'Anything to describe this manual run'

env:
  CLUSTER_NAME: "nightly-cluster"
  EXP_NAME: "nightly_tests"
  RG: "pradjoshi-aml-rg"
  WS: "pradjoshi-aml-ws"
  TEST_LOGS_PATH: '"test_logs.log"'
  PYTEST_EXIT_CODE: "pytest_exit_code.log"

jobs:
  nightly-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['"python=3.7"']
        test-group: ['"group_cpu_001"', '"group_cpu_002"', '"group_cpu_003"', '"group_gpu_001"', '"group_gpu_002"', '"group_gpu_003"', '"group_gpu_004"', '"group_gpu_005"', '"group_gpu_006"', '"group_gpu_007"', '"group_gpu_008"', '"group_gpu_009"', '"group_gpu_010"', '"group_spark_001"']
    steps:
      - name: Check out repository code
        uses: actions/checkout@v2
      - name: Setup python
        uses: actions/setup-python@v2
        with:
          python-version: "3.8"
      - name: Install azureml-core and azure-cli on a GitHub hosted server
        run: pip install azureml-core azure-cli
      - name: Log in to Azure
        uses: azure/login@v1
        with:
          creds: ${{secrets.AML_TEST_CREDENTIALS}}
      - name: Install wheel package
        run: pip install wheel
      - name: Create wheel from setup.py
        run: python setup.py bdist_wheel
      - name: Extract branch name
        shell: bash
        run: echo "##[set-output name=branch;]$(echo ${GITHUB_REF#refs/heads/})"
        id: extract_branch
      - name: Submit CPU tests to AzureML
        if: contains(matrix.test-group, 'cpu')
        run: >-
          python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{env.CLUSTER_NAME}}
          --subid ${{secrets.AML_TEST_SUBID}} --reponame "recommenders" --branch ${{ steps.extract_branch.outputs.branch }}
          --rg ${{env.RG}} --wsname ${{env.WS}} --expname ${{env.EXP_NAME}}_${{matrix.test-group}}
          --testlogs ${{env.TEST_LOGS_PATH}} --testkind "nightly"
          --conda_pkg_python ${{matrix.python-version}} --testgroup ${{matrix.test-group}}
      - name: Submit GPU tests to AzureML
        if: contains(matrix.test-group, 'gpu')
        run: >-
          python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{env.CLUSTER_NAME}}
          --subid ${{secrets.AML_TEST_SUBID}} --reponame "recommenders" --branch ${{ steps.extract_branch.outputs.branch }}
          --rg ${{env.RG}} --wsname ${{env.WS}} --expname ${{env.EXP_NAME}}_${{matrix.test-group}}
          --testlogs ${{env.TEST_LOGS_PATH}} --add_gpu_dependencies --testkind "nightly"
          --conda_pkg_python ${{matrix.python-version}} --testgroup ${{matrix.test-group}}
      - name: Submit PySpark tests to AzureML
        if: contains(matrix.test-group, 'spark')
        run: >-
          python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{env.CLUSTER_NAME}}
          --subid ${{secrets.AML_TEST_SUBID}} --reponame "recommenders" --branch ${{ steps.extract_branch.outputs.branch }}
          --rg ${{env.RG}} --wsname ${{env.WS}} --expname ${{env.EXP_NAME}}_${{matrix.test-group}}
          --testlogs ${{env.TEST_LOGS_PATH}} --add_spark_dependencies --testkind "nightly"
          --conda_pkg_python ${{matrix.python-version}} --testgroup ${{matrix.test-group}}
      - name: Print test logs
        run: cat ${{env.TEST_LOGS_PATH}}
      - name: Get exit status
        id: exit_status
        run: echo ::set-output name=code::$(cat ${{env.PYTEST_EXIT_CODE}})
      - name: Check Success/Failure
        if: ${{ steps.exit_status.outputs.code != 0 }}
        uses: actions/github-script@v3
        with:
          script: |
            core.setFailed('All tests did not pass!')
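The last two steps turn the pytest exit code, written to the file named by `PYTEST_EXIT_CODE`, into a job failure. A minimal stdlib sketch of that handshake, using the standard pytest exit-code meanings (the `check_exit_code` helper is illustrative, not part of the repo):

```python
from pathlib import Path

# Standard pytest exit codes (pytest.ExitCode)
PYTEST_EXIT_CODES = {
    0: "all tests passed",
    1: "some tests failed",
    2: "execution interrupted",
    3: "internal error",
    4: "pytest usage error",
    5: "no tests collected",
}

def check_exit_code(path: str) -> str:
    """Read the exit-code file and fail the job if the run was not clean."""
    code = int(Path(path).read_text().strip())
    meaning = PYTEST_EXIT_CODES.get(code, "unknown exit code")
    if code != 0:
        # Mirrors the workflow's github-script step: core.setFailed(...)
        raise SystemExit(f"All tests did not pass! ({code}: {meaning})")
    return meaning

# Example: a clean run
Path("pytest_exit_code.log").write_text("0")
print(check_exit_code("pytest_exit_code.log"))  # all tests passed
```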
92 changes: 92 additions & 0 deletions .github/workflows/azureml-unit-tests.yml
@@ -0,0 +1,92 @@
# ---------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.
# ---------------------------------------------------------

name: azureml-unit-tests

on:
  # pull_request:
  #   branches: [ staging, main ]

  # push:
  #   branches: [pradjoshi/aml_tests]

  # enable manual trigger
  workflow_dispatch:
    inputs:
      tags:
        description: 'Tags to label this manual run (optional)'
        default: 'Anything to describe this manual run'

env:
  CLUSTER_NAME: "nightly-cluster"
  EXP_NAME: "unit_tests"
  RG: "pradjoshi-aml-rg"
  WS: "pradjoshi-aml-ws"
  TEST_LOGS_PATH: '"test_logs.log"'
  PYTEST_EXIT_CODE: "pytest_exit_code.log"

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['"python=3.7"']
        test-group: ['"group_cpu_001"', '"group_notebooks_cpu_001"', '"group_notebooks_cpu_002"', '"group_pyspark_001"', '"group_pyspark_002"', '"group_notebooks_pyspark_001"', '"group_notebooks_pyspark_002"', '"group_notebooks_pyspark_003"', '"group_gpu_001"', '"group_gpu_002"', '"group_gpu_003"', '"group_notebooks_gpu_001"', '"group_notebooks_gpu_002"', '"group_notebooks_gpu_003"']
    steps:
      - name: Check out repository code
        uses: actions/checkout@v2
      - name: Setup python
        uses: actions/setup-python@v2
        with:
          python-version: "3.8"
      - name: Install azureml-core and azure-cli on a GitHub hosted server
        run: pip install azureml-core azure-cli
      - name: Log in to Azure
        uses: azure/login@v1
        with:
          creds: ${{secrets.AML_TEST_CREDENTIALS}}
      - name: Install wheel package
        run: pip install wheel
      - name: Create wheel from setup.py
        run: python setup.py bdist_wheel
      - name: Extract branch name
        shell: bash
        run: echo "##[set-output name=branch;]$(echo ${GITHUB_REF#refs/heads/})"
        id: extract_branch
      - name: Submit CPU tests to AzureML
        if: contains(matrix.test-group, 'cpu')
        run: >-
          python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{env.CLUSTER_NAME}}
          --subid ${{secrets.AML_TEST_SUBID}} --reponame "recommenders" --branch ${{ steps.extract_branch.outputs.branch }}
          --rg ${{env.RG}} --wsname ${{env.WS}} --expname ${{env.EXP_NAME}}_${{matrix.test-group}}
          --testlogs ${{env.TEST_LOGS_PATH}} --testkind "unit"
          --conda_pkg_python ${{matrix.python-version}} --testgroup ${{matrix.test-group}}
      - name: Submit GPU tests to AzureML
        if: contains(matrix.test-group, 'gpu')
        run: >-
          python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{env.CLUSTER_NAME}}
          --subid ${{secrets.AML_TEST_SUBID}} --reponame "recommenders" --branch ${{ steps.extract_branch.outputs.branch }}
          --rg ${{env.RG}} --wsname ${{env.WS}} --expname ${{env.EXP_NAME}}_${{matrix.test-group}}
          --testlogs ${{env.TEST_LOGS_PATH}} --add_gpu_dependencies --testkind "unit"
          --conda_pkg_python ${{matrix.python-version}} --testgroup ${{matrix.test-group}}
      - name: Submit PySpark tests to AzureML
        if: contains(matrix.test-group, 'spark')
        run: >-
          python tests/ci/azureml_tests/submit_groupwise_azureml_pytest.py --clustername ${{env.CLUSTER_NAME}}
          --subid ${{secrets.AML_TEST_SUBID}} --reponame "recommenders" --branch ${{ steps.extract_branch.outputs.branch }}
          --rg ${{env.RG}} --wsname ${{env.WS}} --expname ${{env.EXP_NAME}}_${{matrix.test-group}}
          --testlogs ${{env.TEST_LOGS_PATH}} --add_spark_dependencies --testkind "unit"
          --conda_pkg_python ${{matrix.python-version}} --testgroup ${{matrix.test-group}}
      - name: Print test logs
        run: cat ${{env.TEST_LOGS_PATH}}
      - name: Get exit status
        id: exit_status
        run: echo ::set-output name=code::$(cat ${{env.PYTEST_EXIT_CODE}})
      - name: Check Success/Failure
        if: ${{ steps.exit_status.outputs.code != 0 }}
        uses: actions/github-script@v3
        with:
          script: |
            core.setFailed('All tests did not pass!')
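The three `Submit … tests` steps differ only in the dependency flag passed to the submission script, selected from the group name via the `contains(matrix.test-group, ...)` conditions. A sketch of that selection logic (the function name is illustrative, not part of the repo):

```python
def extra_dependency_flags(test_group: str) -> list:
    """Map a test-group name to the extra submission flags, mirroring
    the contains() conditions in the workflow's submit steps."""
    if "gpu" in test_group:
        return ["--add_gpu_dependencies"]
    if "spark" in test_group:
        return ["--add_spark_dependencies"]
    return []  # plain CPU groups need no extra dependencies

print(extra_dependency_flags("group_gpu_002"))                # ['--add_gpu_dependencies']
print(extra_dependency_flags("group_notebooks_pyspark_001"))  # ['--add_spark_dependencies']
print(extra_dependency_flags("group_cpu_001"))                # []
```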
99 changes: 0 additions & 99 deletions tests/ci/azure_artifact_feed.yaml

This file was deleted.

74 changes: 74 additions & 0 deletions tests/ci/azureml_tests/run_groupwise_pytest.py
@@ -0,0 +1,74 @@
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.

"""
run_groupwise_pytest.py is the script submitted to Azure ML that runs
pytest. pytest runs all tests in the specified test group unless
parameters are set otherwise.
"""

import logging
import sys
import argparse
import glob

import pytest
from azureml.core import Run

from test_groups import nightly_test_groups, unit_test_groups

if __name__ == "__main__":

    logger = logging.getLogger("submit_groupwise_azureml_pytest.py")
    logging.basicConfig(stream=sys.stdout, level=logging.INFO)

    parser = argparse.ArgumentParser(description="Process inputs")
    parser.add_argument(
        "--testkind",
        "-k",
        action="store",
        default="unit",
        help="Test kind - nightly or unit",
    )
    parser.add_argument(
        "--testgroup",
        "-g",
        action="store",
        default="group_cpu_001",
        help="Group name for the tests",
    )
    args = parser.parse_args()

    if args.testkind == "nightly":
        test_group = nightly_test_groups[args.testgroup]
    else:
        test_group = unit_test_groups[args.testgroup]

    logger.info("Tests to be executed")
    logger.info(str(test_group))

    # Run.get_context() is needed to save the context, as pytest causes
    # corruption of environment variables
    run = Run.get_context()

    logger.info("Python version: %s", str(sys.version))
    logger.info("Executing tests now...")

    # execute the pytest command
    pytest_exit_code = pytest.main(test_group)

    logger.info("Test execution completed!")

    # log the pytest exit code as a metric, used to indicate
    # success/failure in the GitHub workflow
    run.log("pytest_exit_code", pytest_exit_code.value)

    # Leveraged code from this notebook:
    # https://msdata.visualstudio.com/Vienna/_search?action=contents&text=upload_folder&type=code&lp=code-Project&filters=ProjectFilters%7BVienna%7DRepositoryFilters%7BAzureMlCli%7D&pageSize=25&sortOptions=%5B%7B%22field%22%3A%22relevance%22%2C%22sortOrder%22%3A%22desc%22%7D%5D&result=DefaultCollection%2FVienna%2FAzureMlCli%2FGBmaster%2F%2Fsrc%2Fazureml-core%2Fazureml%2Fcore%2Frun.py

    # upload the pytest stdout file
    logs_path = glob.glob("**/70_driver_log.txt", recursive=True)[0]
    run.upload_file(name="test_logs", path_or_stream=logs_path)
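`test_groups.py` (not shown in this diff) is expected to expose two dicts mapping each group name to the list of pytest arguments that group runs. A hedged sketch of the shape this script consumes — the group contents and test paths below are illustrative only:

```python
# Illustrative shape of test_groups.py; the real group contents live in the repo.
nightly_test_groups = {
    "group_cpu_001": [
        "tests/smoke/examples/test_notebooks_python.py",
        "tests/integration/examples/test_notebooks_python.py",
    ],
}
unit_test_groups = {
    "group_cpu_001": ["tests/unit/recommenders/datasets/test_pandas_df_utils.py"],
}

def select_group(testkind: str, testgroup: str) -> list:
    """Replicates the selection branch in run_groupwise_pytest.py."""
    groups = nightly_test_groups if testkind == "nightly" else unit_test_groups
    return groups[testgroup]

print(len(select_group("nightly", "group_cpu_001")))  # 2
```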