
Fix HugeCTR inference example #1130

Merged: 1 commit merged into NVIDIA-Merlin:main from the fix_hugectr_inf branch on Sep 17, 2021
Conversation

@benfred (Member) commented Sep 17, 2021

We were writing out an invalid JSON file. Fix.

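The diff itself is not shown in this conversation, so the following is only a hedged sketch of the general fix pattern (the file name, keys, and paths are illustrative, not taken from the example): building the inference configuration as a Python dict and serializing it with `json.dump` guarantees syntactically valid JSON, whereas hand-formatted strings easily pick up trailing commas or unquoted keys.

```python
import json

# Hypothetical illustration, not the actual change in this PR:
# assemble the config as a dict and let json.dump emit valid JSON.
config = {
    "inference": {
        "max_batchsize": 64,
        "dense_model_file": "/model/dense.model",        # illustrative path
        "sparse_model_files": ["/model/sparse0.model"],  # illustrative path
    },
}

with open("ps.json", "w") as f:
    json.dump(config, f, indent=4)

# Cheap guard against regressions: the file must parse back.
with open("ps.json") as f:
    json.load(f)
```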
@nvidia-merlin-bot (Contributor) commented:

CI Results
GitHub pull request #1130 of commit e08dc68ea92c96c804623bd4c782e8202031ed7c, no merge conflicts.
Running as SYSTEM
Setting status of e08dc68ea92c96c804623bd4c782e8202031ed7c to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/3484/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA/NVTabular.git +refs/pull/1130/*:refs/remotes/origin/pr/1130/* # timeout=10
 > git rev-parse e08dc68ea92c96c804623bd4c782e8202031ed7c^{commit} # timeout=10
Checking out Revision e08dc68ea92c96c804623bd4c782e8202031ed7c (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e08dc68ea92c96c804623bd4c782e8202031ed7c # timeout=10
Commit message: "Fix HugeCTR inference example"
 > git rev-list --no-walk f886efebfe0823caba7dcd1432efe82c6c7679ea # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins1715999712208193218.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (21.2.4)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (58.0.4)
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.0)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.7.1)
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+62.ge08dc68 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+62.ge08dc68 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+62.ge08dc68 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.6.0+62.ge08dc68 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.6.0+62.ge08dc68 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Processing dependencies for nvtabular==0.6.0+62.ge08dc68
Searching for protobuf==3.17.3
Best match: protobuf 3.17.3
Adding protobuf 3.17.3 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for tensorflow-metadata==1.2.0
Best match: tensorflow-metadata 1.2.0
Processing tensorflow_metadata-1.2.0-py3.8.egg
tensorflow-metadata 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tensorflow_metadata-1.2.0-py3.8.egg
Searching for pyarrow==4.0.1
Best match: pyarrow 4.0.1
Adding pyarrow 4.0.1 to easy-install.pth file
Installing plasma_store script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for tqdm==4.61.2
Best match: tqdm 4.61.2
Processing tqdm-4.61.2-py3.8.egg
tqdm 4.61.2 is already the active version in easy-install.pth
Installing tqdm script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tqdm-4.61.2-py3.8.egg
Searching for numba==0.54.0
Best match: numba 0.54.0
Processing numba-0.54.0-py3.8-linux-x86_64.egg
numba 0.54.0 is already the active version in easy-install.pth
Installing pycc script to /var/jenkins_home/.local/bin
Installing numba script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg
Searching for pandas==1.2.5
Best match: pandas 1.2.5
Processing pandas-1.2.5-py3.8-linux-x86_64.egg
pandas 1.2.5 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.2.5-py3.8-linux-x86_64.egg
Searching for distributed==2021.4.1
Best match: distributed 2021.4.1
Processing distributed-2021.4.1-py3.8.egg
distributed 2021.4.1 is already the active version in easy-install.pth
Installing dask-ssh script to /var/jenkins_home/.local/bin
Installing dask-scheduler script to /var/jenkins_home/.local/bin
Installing dask-worker script to /var/jenkins_home/.local/bin

Using /var/jenkins_home/.local/lib/python3.8/site-packages/distributed-2021.4.1-py3.8.egg
Searching for dask==2021.4.1
Best match: dask 2021.4.1
Processing dask-2021.4.1-py3.8.egg
dask 2021.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.4.1-py3.8.egg
Searching for PyYAML==5.4.1
Best match: PyYAML 5.4.1
Processing PyYAML-5.4.1-py3.8-linux-x86_64.egg
PyYAML 5.4.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/PyYAML-5.4.1-py3.8-linux-x86_64.egg
Searching for six==1.15.0
Best match: six 1.15.0
Adding six 1.15.0 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for googleapis-common-protos==1.53.0
Best match: googleapis-common-protos 1.53.0
Processing googleapis_common_protos-1.53.0-py3.8.egg
googleapis-common-protos 1.53.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/googleapis_common_protos-1.53.0-py3.8.egg
Searching for absl-py==0.12.0
Best match: absl-py 0.12.0
Processing absl_py-0.12.0-py3.8.egg
absl-py 0.12.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/absl_py-0.12.0-py3.8.egg
Searching for numpy==1.20.2
Best match: numpy 1.20.2
Adding numpy 1.20.2 to easy-install.pth file
Installing f2py script to /var/jenkins_home/.local/bin
Installing f2py3 script to /var/jenkins_home/.local/bin
Installing f2py3.8 script to /var/jenkins_home/.local/bin

Using /usr/local/lib/python3.8/dist-packages
Searching for setuptools==58.0.4
Best match: setuptools 58.0.4
Adding setuptools 58.0.4 to easy-install.pth file

Using /var/jenkins_home/.local/lib/python3.8/site-packages
Searching for llvmlite==0.37.0
Best match: llvmlite 0.37.0
Processing llvmlite-0.37.0-py3.8-linux-x86_64.egg
llvmlite 0.37.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/llvmlite-0.37.0-py3.8-linux-x86_64.egg
Searching for pytz==2021.1
Best match: pytz 2021.1
Adding pytz 2021.1 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for python-dateutil==2.8.2
Best match: python-dateutil 2.8.2
Adding python-dateutil 2.8.2 to easy-install.pth file

Using /usr/local/lib/python3.8/dist-packages
Searching for zict==2.0.0
Best match: zict 2.0.0
Processing zict-2.0.0-py3.8.egg
zict 2.0.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/zict-2.0.0-py3.8.egg
Searching for tornado==6.1
Best match: tornado 6.1
Processing tornado-6.1-py3.8-linux-x86_64.egg
tornado 6.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tornado-6.1-py3.8-linux-x86_64.egg
Searching for toolz==0.11.1
Best match: toolz 0.11.1
Processing toolz-0.11.1-py3.8.egg
toolz 0.11.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/toolz-0.11.1-py3.8.egg
Searching for tblib==1.7.0
Best match: tblib 1.7.0
Processing tblib-1.7.0-py3.8.egg
tblib 1.7.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/tblib-1.7.0-py3.8.egg
Searching for sortedcontainers==2.4.0
Best match: sortedcontainers 2.4.0
Processing sortedcontainers-2.4.0-py3.8.egg
sortedcontainers 2.4.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/sortedcontainers-2.4.0-py3.8.egg
Searching for psutil==5.8.0
Best match: psutil 5.8.0
Processing psutil-5.8.0-py3.8-linux-x86_64.egg
psutil 5.8.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/psutil-5.8.0-py3.8-linux-x86_64.egg
Searching for msgpack==1.0.2
Best match: msgpack 1.0.2
Processing msgpack-1.0.2-py3.8-linux-x86_64.egg
msgpack 1.0.2 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/msgpack-1.0.2-py3.8-linux-x86_64.egg
Searching for cloudpickle==1.6.0
Best match: cloudpickle 1.6.0
Processing cloudpickle-1.6.0-py3.8.egg
cloudpickle 1.6.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/cloudpickle-1.6.0-py3.8.egg
Searching for click==8.0.1
Best match: click 8.0.1
Processing click-8.0.1-py3.8.egg
click 8.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/click-8.0.1-py3.8.egg
Searching for partd==1.2.0
Best match: partd 1.2.0
Processing partd-1.2.0-py3.8.egg
partd 1.2.0 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/partd-1.2.0-py3.8.egg
Searching for fsspec==2021.8.1
Best match: fsspec 2021.8.1
Processing fsspec-2021.8.1-py3.8.egg
fsspec 2021.8.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/fsspec-2021.8.1-py3.8.egg
Searching for HeapDict==1.0.1
Best match: HeapDict 1.0.1
Processing HeapDict-1.0.1-py3.8.egg
HeapDict 1.0.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/HeapDict-1.0.1-py3.8.egg
Searching for locket==0.2.1
Best match: locket 0.2.1
Processing locket-0.2.1-py3.8.egg
locket 0.2.1 is already the active version in easy-install.pth

Using /var/jenkins_home/.local/lib/python3.8/site-packages/locket-0.2.1-py3.8.egg
Finished processing dependencies for nvtabular==0.6.0+62.ge08dc68
Running black --check
All done! ✨ 🍰 ✨
127 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.ops.categorify
nvtabular/ops/categorify.py:493:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
************* Module nvtabular.ops.fill
nvtabular/ops/fill.py:67:15: I1101: Module 'nvtabular_cpp' has no 'inference' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.6) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: cov-2.12.1, forked-1.3.0, xdist-2.3.0
collected 1510 items / 1 skipped / 1509 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
..................................................................... [ 7%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 15%]
.........ssssssss.....................................................ss [ 20%]
[ 20%]
tests/unit/test_notebooks.py ...... [ 20%]
tests/unit/test_tf4rec.py . [ 20%]
tests/unit/test_tools.py ...................... [ 22%]
tests/unit/test_triton_inference.py .............................. [ 24%]
tests/unit/columns/test_column_schemas.py .............................. [ 26%]
.................................................. [ 29%]
tests/unit/columns/test_column_selector.py .................... [ 30%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 30%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 32%]
................................................... [ 36%]
tests/unit/framework_utils/test_torch_layers.py . [ 36%]
tests/unit/loader/test_dataloader_backend.py . [ 36%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 38%]
........................................s.. [ 41%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 43%]
....................................................... [ 46%]
tests/unit/ops/test_column_similarity.py ........................ [ 48%]
tests/unit/ops/test_ops.py ............................................. [ 51%]
........................................................................ [ 56%]
........................................................................ [ 60%]
........................................................................ [ 65%]
........................................................................ [ 70%]
........................................................................ [ 75%]
............................................ [ 78%]
tests/unit/ops/test_ops_schema.py ...................................... [ 80%]
........................................................................ [ 85%]
........................................................................ [ 90%]
.............. [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
.......................................................... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py .. [ 98%]
tests/unit/workflow/test_workflow_schemas.py ....................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 3 warnings
tests/unit/test_io.py: 24 warnings
tests/unit/test_tf4rec.py: 2 warnings
tests/unit/test_tools.py: 2 warnings
tests/unit/test_triton_inference.py: 5 warnings
tests/unit/loader/test_tf_dataloader.py: 50 warnings
tests/unit/loader/test_torch_dataloader.py: 16 warnings
tests/unit/ops/test_column_similarity.py: 7 warnings
tests/unit/ops/test_ops.py: 74 warnings
tests/unit/workflow/test_workflow.py: 31 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/.local/lib/python3.8/site-packages/numba-0.54.0-py3.8-linux-x86_64.egg/numba/cuda/compiler.py:865: NumbaPerformanceWarning: Grid size (1) < 2 * SM count (112) will likely result in GPU under utilization due to low occupancy.
warn(NumbaPerformanceWarning(msg))

tests/unit/test_io.py::test_validate_dataset_bad_schema
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:1102: UserWarning: Unable to sample column dtypes to infer nvt.Dataset schema, schema is empty.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 24 warnings
tests/unit/loader/test_torch_dataloader.py: 54 warnings
tests/unit/workflow/test_workflow_node.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/node.py:47: FutureWarning: The ["a", "b", "c"] >> ops.Operator syntax for creating a ColumnGroup has been deprecated in NVTabular 21.09 and will be removed in a future version.
warnings.warn(

tests/unit/test_io.py: 36 warnings
tests/unit/workflow/test_workflow.py: 44 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/workflow/workflow.py:89: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for execution. Please use the client argument to initialize a Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 52 warnings
tests/unit/workflow/test_workflow.py: 35 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dask.py:372: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler will be used for this write operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(

tests/unit/test_io.py: 36 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:508: UserWarning: A global dask.distributed client has been detected, but the single-threaded scheduler is being used for this shuffle operation. Please use the client argument to initialize a Dataset and/or Workflow object with distributed-execution enabled.
warnings.warn(
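
The dask.distributed warnings above all point at the same usage pattern. A minimal sketch of what they recommend, assuming the NVTabular API of this release in which `nvt.Workflow` accepts a `client` keyword (the column names and parquet path below are placeholders):

```python
import nvtabular as nvt
from dask.distributed import Client, LocalCluster

# Start a distributed cluster explicitly and keep a handle to its client.
cluster = LocalCluster(n_workers=2)
client = Client(cluster)

# Passing the client to the Workflow opts in to distributed execution,
# instead of the silent fallback to the single-threaded scheduler that
# the warning describes.
features = ["x", "y"] >> nvt.ops.Normalize()  # placeholder columns
workflow = nvt.Workflow(features, client=client)

dataset = nvt.Dataset("data.parquet")  # placeholder input path
workflow.fit(dataset)
```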

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:125: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-parquet-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-0.1]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.01]
tests/unit/ops/test_ops.py::test_fill_median[True-True-op_columns1-csv-no-header-0.1]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:126: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.medians[col])

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_ops.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.2.5-py3.8-linux-x86_64.egg/pandas/core/indexing.py:1637: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:54: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[f"{col}_filled"] = df[col].isna()

tests/unit/ops/test_ops.py::test_fill_missing[True-True-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/fill.py:55: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[col] = df[col].fillna(self.fill_val)

tests/unit/ops/test_ops.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:190: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
df[tmp] = _arange(len(df), like_df=df, dtype="int32")

tests/unit/ops/test_ops.py::test_join_external[True-True-left-host-pandas-parquet]
tests/unit/ops/test_ops.py::test_join_external[True-True-left-device-pandas-parquet]
tests/unit/ops/test_ops.py::test_join_external[True-True-inner-host-pandas-parquet]
tests/unit/ops/test_ops.py::test_join_external[True-True-inner-device-pandas-parquet]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/ops/join_external.py:171: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
_ext.drop_duplicates(ignore_index=True, inplace=True)

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_groupby_op[id-True]
tests/unit/ops/test_ops.py::test_groupby_op[id-False]
/var/jenkins_home/.local/lib/python3.8/site-packages/dask-2021.4.1-py3.8.egg/dask/dataframe/core.py:6610: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_cpu_workflow.py: 78 warnings
/var/jenkins_home/.local/lib/python3.8/site-packages/pandas-1.2.5-py3.8-linux-x86_64.egg/pandas/core/frame.py:3191: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self[k1] = value[k2]

-- Docs: https://docs.pytest.org/en/stable/warnings.html
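
Several of the SettingWithCopyWarning entries above quote pandas' suggestion to assign through `.loc[row_indexer, col_indexer]`. A small self-contained sketch of that pattern (the frame and column names are invented; this is not the code from nvtabular/ops/fill.py):

```python
import pandas as pd

df = pd.DataFrame({"col": [1.0, None, 3.0], "keep": [True, False, True]})

# Assigning to a filtered slice is what triggers the warning, because the
# slice may be a copy of the original frame:
#   sub = df[df["keep"]]
#   sub["col_filled"] = sub["col"].isna()  # may modify a temporary copy
#
# Writing through .loc on the original frame is unambiguous:
df.loc[df["keep"], "col_filled"] = df.loc[df["keep"], "col"].isna()
df["col"] = df["col"].fillna(df["col"].median())

print(df)
```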

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/columns/__init__.py 2 0 0 0 100%
nvtabular/columns/schema.py 213 20 107 23 87% 45->61, 48, 50, 52-55, 57, 67, 82, 98->109, 104, 147, 174, 260->267, 262, 263->265, 275, 291, 292->297, 295->297, 308, 332, 339, 348, 351, 356->355
nvtabular/columns/selector.py 74 1 34 0 99% 121
nvtabular/dispatch.py 273 55 132 22 78% 36-40, 45-47, 53-63, 70-71, 99-101, 106-109, 113-118, 125, 144, 155, 161, 166->168, 179, 202-205, 244, 247, 253, 269, 276, 307->312, 310, 313, 316->320, 353, 364-367, 394-397, 427, 431, 472, 496, 498, 505
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 85 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 20 1 43% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 10 0 15% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 14 2 91% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 28 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 30 5 90% 51->53, 64, 71->76, 75, 118-120
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/__init__.py 385 210 180 13 45% 82-86, 141-174, 195-218, 263-307, 338, 364-372, 380-387, 406, 428-444, 485-489, 527-537, 583-623, 629-645, 649-716, 723->726, 726->722, 762-772, 781, 791, 812, 818-844, 850-876, 883, 889->892, 893
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/model.py 176 176 98 0 0% 27-332
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/model_pt.py 101 101 40 0 0% 27-220
nvtabular/io/__init__.py 4 0 0 0 100%
nvtabular/io/avro.py 88 88 30 0 0% 16-189
nvtabular/io/csv.py 57 6 20 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 183 18 72 11 87% 111, 114, 150, 235-246, 398, 408, 425->428, 436, 440->442, 442->438, 447, 449
nvtabular/io/dataframe_engine.py 61 5 28 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataset.py 351 76 164 28 75% 46-47, 257, 259, 272, 281, 299-313, 433->507, 438-441, 447-454, 459-503, 507->516, 567-568, 569->573, 616, 738, 740, 742, 748, 752-754, 756, 816-817, 844, 851-852, 858, 864, 960-961, 1078-1083, 1089, 1168, 1177
nvtabular/io/dataset_engine.py 24 1 0 0 96% 48
nvtabular/io/hugectr.py 45 2 24 2 91% 34, 74->97, 101
nvtabular/io/parquet.py 543 45 178 26 89% 34-35, 56, 74, 78->90, 87, 110, 120->125, 138, 140, 164->168, 171-177, 225-226, 229-237, 252, 258, 276->278, 291, 310-320, 461-466, 504-509, 625->632, 693->698, 699-700, 820, 824, 828, 834, 866, 883, 887, 894->896, 1004->exit, 1014->1019, 1024->1034, 1039, 1061
nvtabular/io/shuffle.py 31 6 16 5 77% 42, 44-45, 49, 59, 63
nvtabular/io/writer.py 175 13 68 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 288-290
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 328 13 138 10 95% 128, 143-144, 235->237, 247-251, 297-298, 337->341, 412, 416-417, 447, 552, 560
nvtabular/loader/tensorflow.py 163 22 52 7 86% 58, 66-69, 84, 98, 308, 344, 359-361, 390-392, 402-410, 413-416
nvtabular/loader/tf_utils.py 55 10 20 5 80% 29->32, 32->34, 39->41, 43, 50-51, 58-60, 66-70
nvtabular/loader/torch.py 81 13 16 2 78% 25-27, 30-36, 111, 149-150
nvtabular/ops/__init__.py 21 0 0 0 100%
nvtabular/ops/bucketize.py 37 10 18 3 69% 53-55, 59->exit, 62-65, 84-87, 94
nvtabular/ops/categorify.py 619 66 334 47 86% 244, 246, 263, 267, 275, 283, 285, 312, 331-332, 355, 366->370, 374-381, 463-464, 489-490, 499, 562->558, 584->586, 684, 702, 738, 816-817, 832-836, 837->801, 855, 863, 870->exit, 894, 897->900, 952, 957, 973->977, 984-987, 998, 1002, 1004, 1011, 1016-1019, 1097, 1099, 1169->1192, 1175->1192, 1193-1198, 1235, 1254->1259, 1258, 1268->1265, 1273->1265, 1280, 1283, 1291-1301
nvtabular/ops/clip.py 18 2 6 3 79% 44, 52->54, 55
nvtabular/ops/column_similarity.py 118 25 38 5 74% 19-20, 78->exit, 108, 134, 198-199, 208-210, 218-234, 251->254, 255, 265
nvtabular/ops/data_stats.py 56 2 22 3 94% 91->93, 95, 97->87, 102
nvtabular/ops/difference_lag.py 31 1 8 1 95% 69->71, 94
nvtabular/ops/dropna.py 8 0 0 0 100%
nvtabular/ops/fill.py 91 12 36 3 82% 63-67, 93, 121, 147, 151, 162-165
nvtabular/ops/filter.py 20 1 6 1 92% 49
nvtabular/ops/groupby.py 119 3 70 4 96% 73, 84, 94->96, 106->111, 141
nvtabular/ops/hash_bucket.py 35 3 18 2 87% 72, 102, 108
nvtabular/ops/hashed_cross.py 36 4 15 3 86% 53, 66, 81, 91
nvtabular/ops/internal/__init__.py 3 0 0 0 100%
nvtabular/ops/internal/concat_columns.py 11 0 0 0 100%
nvtabular/ops/internal/identity.py 6 1 0 0 83% 42
nvtabular/ops/internal/subset_columns.py 13 1 0 0 92% 53
nvtabular/ops/join_external.py 89 7 36 6 90% 20-21, 113, 115, 117, 159, 176->178, 215
nvtabular/ops/join_groupby.py 101 7 36 4 92% 108, 115, 124, 131->130, 215-216, 219-220
nvtabular/ops/lambdaop.py 39 6 18 6 79% 59, 63, 77, 89, 94, 103
nvtabular/ops/list_slice.py 66 24 26 1 58% 21-22, 53-54, 104-118, 126-137
nvtabular/ops/logop.py 13 0 0 0 100%
nvtabular/ops/moments.py 65 0 20 0 100%
nvtabular/ops/normalize.py 81 10 14 1 86% 70, 78-79, 85, 118-119, 141-142, 146, 157
nvtabular/ops/operator.py 64 1 12 1 97% 111
nvtabular/ops/rename.py 41 3 22 3 90% 47, 88-90
nvtabular/ops/stat_operator.py 8 0 0 0 100%
nvtabular/ops/target_encoding.py 153 11 66 4 91% 167->171, 175->184, 232-233, 236-237, 249-255, 346->349, 362
nvtabular/tags.py 16 0 0 0 100%
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 236 1 62 1 99% 321
nvtabular/tools/dataset_inspector.py 50 7 18 1 79% 32-39
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 102 43 46 8 52% 31-32, 36-37, 50, 61-62, 64-66, 69, 72, 78, 84, 90-126, 145, 149->153
nvtabular/worker.py 82 5 38 7 90% 24-25, 82->99, 91, 92->99, 99->102, 108, 110, 111->113
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 229 18 110 10 89% 55, 93->98, 146, 248->252, 288, 302, 311, 329-334, 339, 388-389, 400->395, 439-444
nvtabular/workflow/workflow.py 221 15 112 7 93% 28-29, 47, 139, 195, 222-224, 332, 347-348, 366-367, 502, 514

TOTAL 7479 1478 3015 345 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.59%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [8] tests/unit/test_io.py:544: could not import 'uavro': No module named 'uavro'
SKIPPED [2] tests/unit/test_io.py:903: Dask>=2021.07.1 required for file aggregation
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:521: not working correctly in ci environment
========= 1499 passed, 12 skipped, 794 warnings in 1960.14s (0:32:40) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.GitHub.com/repos/NVIDIA/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins1264999062422710614.sh

@benfred benfred merged commit d0668ef into NVIDIA-Merlin:main Sep 17, 2021
@benfred benfred deleted the fix_hugectr_inf branch September 17, 2021 17:58
mikemckiernan pushed a commit that referenced this pull request Nov 24, 2022
We were writing out an invalid JSON file. Fix.