
Fix security issues with requests and pyyaml versions #647

Merged (4 commits) Mar 13, 2019

Conversation

@RehanSD (Collaborator) commented Mar 12, 2019

Update pyyaml and requests versions installed as deps to avoid security issues
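The diff itself isn't reproduced in this thread, but a dependency bump of this kind typically just raises the minimum versions declared for the package. As an illustration only, the version floors below come from the public advisories (requests before 2.20.0 leaks credentials on cross-host redirects, CVE-2018-18074; PyYAML's default `yaml.load` allowed arbitrary object construction before the 5.1-era fixes), not from this PR's actual pins:

```python
# Illustrative check, not code from this PR: floors taken from the
# public advisories for requests and PyYAML, not from this diff.
MIN_SAFE = {"requests": (2, 20, 0), "pyyaml": (5, 1)}

def parse_version(v):
    """Parse a simple 'X.Y.Z' version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(package, installed_version):
    """True if the installed version predates the patched floor."""
    return parse_version(installed_version) < MIN_SAFE[package]
```

For example, `needs_upgrade("requests", "2.19.1")` is True, while `needs_upgrade("pyyaml", "5.1")` is False.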

@RehanSD (Collaborator, Author) commented Mar 12, 2019

Resolves #645

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Clipper-PRB/1800/

@simon-mo (Contributor)

OCI runtime create failed: container_linux.go:344: starting container process caused "process_linux.go:424: container init caused \"process_linux.go:407: running prestart hook 0 caused \\\"error running hook: exit status 1, stdout: , stderr: time=\\\\\\\"2019-03-12T10:58:54-07:00\\\\\\\" level=fatal msg=\\\\\\\"failed to update store for object type *libnetwork.sbState: timeout\\\\\\\"\\\\n\\\"\"": unknown

@simon-mo (Contributor)

Jenkins test this please

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Clipper-PRB/1801/

@RehanSD (Collaborator, Author) commented Mar 12, 2019

Seems like the updated versions are crashing Kubernetes @simon-mo

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Clipper-PRB/1802/

@RehanSD force-pushed the rehan/security_issues branch from b2f0b5e to 3992cdc on March 12, 2019 23:11
@RehanSD (Collaborator, Author) commented Mar 12, 2019

Jenkins test this please

@simon-mo (Contributor) left a comment

lgtm

@simon-mo (Contributor)

The following target failed:

🛑 integration_py3_pyspark

Logs

integration_py3_pyspark
===== start: integration_py3_pyspark =====
 [integration_py3_pyspark] 2019-03-12 23:45:22 WARN  Utils:66 - Your hostname, research-jenkins-worker-07 resolves to a loopback address: 127.0.1.1; using 192.168.10.27 instead (on interface eth0) 
 [integration_py3_pyspark] 2019-03-12 23:45:22 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address 
 [integration_py3_pyspark] 2019-03-12 23:45:28 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
 [integration_py3_pyspark] Setting default log level to "WARN". 
 [integration_py3_pyspark] To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 
 [integration_py3_pyspark] 2019-03-12 23:45:35 WARN  Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041. 
 [integration_py3_pyspark] 2019-03-12 23:45:35 WARN  Utils:66 - Service 'SparkUI' could not bind on port 4041. Attempting port 4042. 
 [integration_py3_pyspark] 2019-03-12 23:45:35 WARN  Utils:66 - Service 'SparkUI' could not bind on port 4042. Attempting port 4043. 
 [integration_py3_pyspark] 19-03-12:23:45:40 INFO     [test_utils.py:75] Creating DockerContainerManager 
 [integration_py3_pyspark] 19-03-12:23:45:40 INFO     [test_utils.py:94] Starting up Docker cluster spark-3512 
 [integration_py3_pyspark] 19-03-12:23:45:40 INFO     [test_utils.py:105] Starting Clipper 
 [integration_py3_pyspark] 19-03-12:23:45:40 INFO     [docker_container_manager.py:151] [spark-3512] Starting managed Redis instance in Docker 
 [integration_py3_pyspark] 19-03-12:23:46:15 INFO     [docker_container_manager.py:229] [spark-3512] Metric Configuration Saved at /tmp/tmpvnfqp646.yml 
 [integration_py3_pyspark] 19-03-12:23:46:24 INFO     [clipper_admin.py:143] [spark-3512] Clipper is running 
 [integration_py3_pyspark] 19-03-12:23:46:26 INFO     [clipper_admin.py:220] [spark-3512] Application pyspark-test was successfully registered 
 [integration_py3_pyspark] [Stage 0:>                                                          (0 + 1) / 1]

2019-03-12 23:46:36 WARN BLAS:61 - Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
[integration_py3_pyspark] 2019-03-12 23:46:36 WARN BLAS:61 - Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
[integration_py3_pyspark] 19-03-12:23:46:36 INFO [deployer_utils.py:41] Saving function to /tmp/tmpbcy42d4eclipper
[integration_py3_pyspark] 19-03-12:23:46:36 INFO [deployer_utils.py:51] Serialized and supplied predict function
[integration_py3_pyspark] [Stage 17:> (0 + 1) / 1]

19-03-12:23:46:44 INFO [pyspark.py:234] Spark model saved
[integration_py3_pyspark] 19-03-12:23:46:44 INFO [pyspark.py:244] Using Python 3.5 base image
[integration_py3_pyspark] 19-03-12:23:46:44 INFO [clipper_admin.py:472] [spark-3512] Building model Docker image with model data from /tmp/tmpbcy42d4eclipper
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:477] [spark-3512] Step 1/2 : FROM clippertesting/pyspark35-container:3992cdc971
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:477] [spark-3512] ---> de110c02a757
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:477] [spark-3512] Step 2/2 : COPY /tmp/tmpbcy42d4eclipper /model/
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:477] [spark-3512] ---> 2cca5c6f780b
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:477] [spark-3512] Successfully built 2cca5c6f780b
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:477] [spark-3512] Successfully tagged spark-3512-pyspark-model:1
[integration_py3_pyspark] 19-03-12:23:46:46 INFO [clipper_admin.py:479] [spark-3512] Pushing model Docker image to spark-3512-pyspark-model:1
[integration_py3_pyspark] 19-03-12:23:46:47 INFO [docker_container_manager.py:353] [spark-3512] Found 0 replicas for pyspark-model:1. Adding 1
[integration_py3_pyspark] /clipper/clipper_admin/clipper_admin/docker/docker_metric_utils.py:124: YAMLLoadWarning:
[integration_py3_pyspark] *** Calling yaml.load() without Loader=... is deprecated.
[integration_py3_pyspark] *** The default Loader is unsafe.
[integration_py3_pyspark] *** Please read https://msg.pyyaml.org/load for full details.
[integration_py3_pyspark] conf = yaml.load(f)
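The YAMLLoadWarning above is the PyYAML 5.x deprecation this version bump surfaces: calling `yaml.load()` without an explicit `Loader` uses the unsafe full loader by default. The standard fix is `safe_load`, which only constructs plain Python types. A minimal sketch (the inline document below is a stand-in for the metric config file, which is not reproduced in this log):

```python
import yaml

# Stand-in for the metric config that docker_metric_utils.py reads;
# the actual /tmp/*.yml file is not shown in this log.
doc = "global:\n  scrape_interval: 15s\n"

# safe_load refuses to construct arbitrary Python objects, which is
# what the deprecation warning is steering callers toward.
conf = yaml.safe_load(doc)
```

An equivalent spelling that keeps the `load` call is `yaml.load(f, Loader=yaml.SafeLoader)`.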
[integration_py3_pyspark] 19-03-12:23:47:11 INFO [clipper_admin.py:656] [spark-3512] Successfully registered model pyspark-model:1
[integration_py3_pyspark] 19-03-12:23:47:11 INFO [clipper_admin.py:574] [spark-3512] Done deploying model pyspark-model:1.
[integration_py3_pyspark] 19-03-12:23:47:16 INFO [clipper_admin.py:282] [spark-3512] Model pyspark-model is now linked to application pyspark-test
[integration_py3_pyspark] 19-03-12:23:47:50 INFO [deployer_utils.py:41] Saving function to /tmp/tmpjtnmep1fclipper
[integration_py3_pyspark] 19-03-12:23:47:50 INFO [deployer_utils.py:51] Serialized and supplied predict function
[integration_py3_pyspark] 19-03-12:23:47:50 INFO [pyspark.py:234] Spark model saved
[integration_py3_pyspark] 19-03-12:23:47:50 INFO [pyspark.py:244] Using Python 3.5 base image
[integration_py3_pyspark] 19-03-12:23:47:50 INFO [clipper_admin.py:472] [spark-3512] Building model Docker image with model data from /tmp/tmpjtnmep1fclipper
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:477] [spark-3512] Step 1/2 : FROM clippertesting/pyspark35-container:3992cdc971
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:477] [spark-3512] ---> de110c02a757
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:477] [spark-3512] Step 2/2 : COPY /tmp/tmpjtnmep1fclipper /model/
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:477] [spark-3512] ---> 62b4285fe336
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:477] [spark-3512] Successfully built 62b4285fe336
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:477] [spark-3512] Successfully tagged spark-3512-pyspark-model:2
[integration_py3_pyspark] 19-03-12:23:47:52 INFO [clipper_admin.py:479] [spark-3512] Pushing model Docker image to spark-3512-pyspark-model:2
[integration_py3_pyspark] 19-03-12:23:49:00 ERROR [deploy_pyspark_models.py:193] Exception
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 360, in _error_catcher
[integration_py3_pyspark] yield
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 442, in read
[integration_py3_pyspark] data = self._fp.read(amt)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 448, in read
[integration_py3_pyspark] n = self.readinto(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 478, in readinto
[integration_py3_pyspark] return self._readinto_chunked(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 573, in _readinto_chunked
[integration_py3_pyspark] chunk_left = self._get_chunk_left()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 541, in _get_chunk_left
[integration_py3_pyspark] chunk_left = self._read_next_chunk_size()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 501, in _read_next_chunk_size
[integration_py3_pyspark] line = self.fp.readline(_MAXLINE + 1)
[integration_py3_pyspark] File "/usr/lib/python3.5/socket.py", line 576, in readinto
[integration_py3_pyspark] return self._sock.recv_into(b)
[integration_py3_pyspark] socket.timeout: timed out
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 167, in
[integration_py3_pyspark] deploy_and_test_model(sc, clipper_conn, svm_model, version)
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 64, in deploy_and_test_model
[integration_py3_pyspark] predict_fn, model, sc)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/deployers/pyspark.py", line 264, in deploy_pyspark_model
[integration_py3_pyspark] registry, num_replicas, batch_size, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 355, in build_and_deploy_model
[integration_py3_pyspark] container_registry, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 480, in build_model
[integration_py3_pyspark] for line in docker_client.images.push(repository=image, stream=True):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 307, in _stream_helper
[integration_py3_pyspark] data = reader.read(1)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 459, in read
[integration_py3_pyspark] raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
[integration_py3_pyspark] File "/usr/lib/python3.5/contextlib.py", line 77, in exit
[integration_py3_pyspark] self.gen.throw(type, value, traceback)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 365, in _error_catcher
[integration_py3_pyspark] raise ReadTimeoutError(self._pool, None, 'Read timed out.')
[integration_py3_pyspark] urllib3.exceptions.ReadTimeoutError: UnixHTTPConnectionPool(host='localhost', port=None): Read timed out.
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 360, in _error_catcher
[integration_py3_pyspark] yield
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 442, in read
[integration_py3_pyspark] data = self._fp.read(amt)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 448, in read
[integration_py3_pyspark] n = self.readinto(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 478, in readinto
[integration_py3_pyspark] return self._readinto_chunked(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 573, in _readinto_chunked
[integration_py3_pyspark] chunk_left = self._get_chunk_left()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 541, in _get_chunk_left
[integration_py3_pyspark] chunk_left = self._read_next_chunk_size()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 501, in _read_next_chunk_size
[integration_py3_pyspark] line = self.fp.readline(_MAXLINE + 1)
[integration_py3_pyspark] File "/usr/lib/python3.5/socket.py", line 576, in readinto
[integration_py3_pyspark] return self._sock.recv_into(b)
[integration_py3_pyspark] socket.timeout: timed out
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 167, in
[integration_py3_pyspark] deploy_and_test_model(sc, clipper_conn, svm_model, version)
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 64, in deploy_and_test_model
[integration_py3_pyspark] predict_fn, model, sc)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/deployers/pyspark.py", line 264, in deploy_pyspark_model
[integration_py3_pyspark] registry, num_replicas, batch_size, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 355, in build_and_deploy_model
[integration_py3_pyspark] container_registry, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 480, in build_model
[integration_py3_pyspark] for line in docker_client.images.push(repository=image, stream=True):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 307, in _stream_helper
[integration_py3_pyspark] data = reader.read(1)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 459, in read
[integration_py3_pyspark] raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
[integration_py3_pyspark] File "/usr/lib/python3.5/contextlib.py", line 77, in exit
[integration_py3_pyspark] self.gen.throw(type, value, traceback)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 365, in _error_catcher
[integration_py3_pyspark] raise ReadTimeoutError(self._pool, None, 'Read timed out.')
[integration_py3_pyspark] urllib3.exceptions.ReadTimeoutError: UnixHTTPConnectionPool(host='localhost', port=None): Read timed out.
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 225, in _raise_for_status
[integration_py3_pyspark] response.raise_for_status()
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/requests/models.py", line 940, in raise_for_status
[integration_py3_pyspark] raise HTTPError(http_error_msg, response=self)
[integration_py3_pyspark] requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http+docker://localhost/v1.35/containers/11c6c87613f0e31736ff295ee05ed21674c1314469047664c31a6bba3a300e57/json
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 194, in
[integration_py3_pyspark] log_docker(clipper_conn)
[integration_py3_pyspark] File "/clipper/integration-tests/test_utils.py", line 181, in log_docker
[integration_py3_pyspark] container_runing = clipper_conn.cm.docker_client.containers.list(limit=10)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/models/containers.py", line 880, in list
[integration_py3_pyspark] return [self.get(r['Id']) for r in resp]
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/models/containers.py", line 880, in
[integration_py3_pyspark] return [self.get(r['Id']) for r in resp]
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/models/containers.py", line 833, in get
[integration_py3_pyspark] resp = self.client.api.inspect_container(container_id)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/utils/decorators.py", line 19, in wrapped
[integration_py3_pyspark] return f(self, resource_id, *args, **kwargs)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/container.py", line 721, in inspect_container
[integration_py3_pyspark] self._get(self._url("/containers/{0}/json", container)), True
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 231, in _result
[integration_py3_pyspark] self._raise_for_status(response)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 227, in _raise_for_status
[integration_py3_pyspark] raise create_api_error_from_http_exception(e)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/errors.py", line 31, in create_api_error_from_http_exception
[integration_py3_pyspark] raise cls(e, response=response, explanation=explanation)
[integration_py3_pyspark] docker.errors.NotFound: 404 Client Error: Not Found ("No such container: 11c6c87613f0e31736ff295ee05ed21674c1314469047664c31a6bba3a300e57")
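The final NotFound above is a race in the log-collection path: `containers.list()` first fetches container ids and then inspects each one, and a container removed in between produces the 404. One generic way to harden such a loop, sketched with a hypothetical helper (this is not code from test_utils.py; with the Docker SDK, `missing_exc` would be `docker.errors.NotFound`):

```python
def inspect_surviving(ids, inspect, missing_exc=Exception):
    """Inspect each id, skipping any deleted between listing and
    inspection. Hypothetical helper, not from this repository."""
    results = []
    for cid in ids:
        try:
            results.append(inspect(cid))
        except missing_exc:
            continue  # container vanished mid-iteration; skip it
    return results
```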
[integration_py3_pyspark] 2019-03-12 23:49:14 WARN Utils:66 - Your hostname, research-jenkins-worker-07 resolves to a loopback address: 127.0.1.1; using 192.168.10.27 instead (on interface eth0)
[integration_py3_pyspark] 2019-03-12 23:49:14 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
[integration_py3_pyspark] 2019-03-12 23:49:14 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[integration_py3_pyspark] Setting default log level to "WARN".
[integration_py3_pyspark] To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[integration_py3_pyspark] 19-03-12:23:49:17 INFO [test_utils.py:75] Creating DockerContainerManager
[integration_py3_pyspark] 19-03-12:23:49:17 INFO [test_utils.py:94] Starting up Docker cluster spark-2284
[integration_py3_pyspark] 19-03-12:23:49:17 INFO [test_utils.py:105] Starting Clipper
[integration_py3_pyspark] 19-03-12:23:49:17 INFO [docker_container_manager.py:151] [spark-2284] Starting managed Redis instance in Docker
[integration_py3_pyspark] 19-03-12:23:49:54 INFO [docker_container_manager.py:229] [spark-2284] Metric Configuration Saved at /tmp/tmp6slhb7p0.yml
[integration_py3_pyspark] 19-03-12:23:50:01 INFO [clipper_admin.py:143] [spark-2284] Clipper is running
[integration_py3_pyspark] 19-03-12:23:50:03 INFO [clipper_admin.py:220] [spark-2284] Application pyspark-test was successfully registered
[integration_py3_pyspark] [Stage 0:> (0 + 1) / 1]

2019-03-12 23:50:08 WARN BLAS:61 - Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
[integration_py3_pyspark] 2019-03-12 23:50:08 WARN BLAS:61 - Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
[integration_py3_pyspark] 19-03-12:23:50:09 INFO [deployer_utils.py:41] Saving function to /tmp/tmp6324m4xdclipper
[integration_py3_pyspark] 19-03-12:23:50:09 INFO [deployer_utils.py:51] Serialized and supplied predict function
[integration_py3_pyspark] 19-03-12:23:50:13 INFO [pyspark.py:234] Spark model saved
[integration_py3_pyspark] 19-03-12:23:50:13 INFO [pyspark.py:244] Using Python 3.5 base image
[integration_py3_pyspark] 19-03-12:23:50:13 INFO [clipper_admin.py:472] [spark-2284] Building model Docker image with model data from /tmp/tmp6324m4xdclipper
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:477] [spark-2284] Step 1/2 : FROM clippertesting/pyspark35-container:3992cdc971
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:477] [spark-2284] ---> de110c02a757
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:477] [spark-2284] Step 2/2 : COPY /tmp/tmp6324m4xdclipper /model/
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:477] [spark-2284] ---> 7d07421f54f1
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:477] [spark-2284] Successfully built 7d07421f54f1
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:477] [spark-2284] Successfully tagged spark-2284-pyspark-model:1
[integration_py3_pyspark] 19-03-12:23:50:35 INFO [clipper_admin.py:479] [spark-2284] Pushing model Docker image to spark-2284-pyspark-model:1
[integration_py3_pyspark] 19-03-12:23:50:56 INFO [docker_container_manager.py:353] [spark-2284] Found 0 replicas for pyspark-model:1. Adding 1
[integration_py3_pyspark] /clipper/clipper_admin/clipper_admin/docker/docker_metric_utils.py:124: YAMLLoadWarning:
[integration_py3_pyspark] *** Calling yaml.load() without Loader=... is deprecated.
[integration_py3_pyspark] *** The default Loader is unsafe.
[integration_py3_pyspark] *** Please read https://msg.pyyaml.org/load for full details.
[integration_py3_pyspark] conf = yaml.load(f)
[integration_py3_pyspark] 19-03-12:23:51:36 INFO [clipper_admin.py:656] [spark-2284] Successfully registered model pyspark-model:1
[integration_py3_pyspark] 19-03-12:23:51:36 INFO [clipper_admin.py:574] [spark-2284] Done deploying model pyspark-model:1.
[integration_py3_pyspark] 19-03-12:23:51:41 INFO [clipper_admin.py:282] [spark-2284] Model pyspark-model is now linked to application pyspark-test
[integration_py3_pyspark] 19-03-12:23:52:14 INFO [deployer_utils.py:41] Saving function to /tmp/tmpg8hjvl5eclipper
[integration_py3_pyspark] 19-03-12:23:52:14 INFO [deployer_utils.py:51] Serialized and supplied predict function
[integration_py3_pyspark] 19-03-12:23:52:15 INFO [pyspark.py:234] Spark model saved
[integration_py3_pyspark] 19-03-12:23:52:15 INFO [pyspark.py:244] Using Python 3.5 base image
[integration_py3_pyspark] 19-03-12:23:52:15 INFO [clipper_admin.py:472] [spark-2284] Building model Docker image with model data from /tmp/tmpg8hjvl5eclipper
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:477] [spark-2284] Step 1/2 : FROM clippertesting/pyspark35-container:3992cdc971
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:477] [spark-2284] ---> de110c02a757
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:477] [spark-2284] Step 2/2 : COPY /tmp/tmpg8hjvl5eclipper /model/
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:477] [spark-2284] ---> 6079ac2f9ea9
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:477] [spark-2284] Successfully built 6079ac2f9ea9
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:477] [spark-2284] Successfully tagged spark-2284-pyspark-model:2
[integration_py3_pyspark] 19-03-12:23:52:19 INFO [clipper_admin.py:479] [spark-2284] Pushing model Docker image to spark-2284-pyspark-model:2
[integration_py3_pyspark] 19-03-12:23:53:23 ERROR [deploy_pyspark_models.py:193] Exception
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 360, in _error_catcher
[integration_py3_pyspark] yield
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 442, in read
[integration_py3_pyspark] data = self._fp.read(amt)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 448, in read
[integration_py3_pyspark] n = self.readinto(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 478, in readinto
[integration_py3_pyspark] return self._readinto_chunked(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 573, in _readinto_chunked
[integration_py3_pyspark] chunk_left = self._get_chunk_left()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 541, in _get_chunk_left
[integration_py3_pyspark] chunk_left = self._read_next_chunk_size()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 501, in _read_next_chunk_size
[integration_py3_pyspark] line = self.fp.readline(_MAXLINE + 1)
[integration_py3_pyspark] File "/usr/lib/python3.5/socket.py", line 576, in readinto
[integration_py3_pyspark] return self._sock.recv_into(b)
[integration_py3_pyspark] socket.timeout: timed out
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 167, in
[integration_py3_pyspark] deploy_and_test_model(sc, clipper_conn, svm_model, version)
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 64, in deploy_and_test_model
[integration_py3_pyspark] predict_fn, model, sc)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/deployers/pyspark.py", line 264, in deploy_pyspark_model
[integration_py3_pyspark] registry, num_replicas, batch_size, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 355, in build_and_deploy_model
[integration_py3_pyspark] container_registry, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 480, in build_model
[integration_py3_pyspark] for line in docker_client.images.push(repository=image, stream=True):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 307, in _stream_helper
[integration_py3_pyspark] data = reader.read(1)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 459, in read
[integration_py3_pyspark] raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
[integration_py3_pyspark] File "/usr/lib/python3.5/contextlib.py", line 77, in exit
[integration_py3_pyspark] self.gen.throw(type, value, traceback)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 365, in _error_catcher
[integration_py3_pyspark] raise ReadTimeoutError(self._pool, None, 'Read timed out.')
[integration_py3_pyspark] urllib3.exceptions.ReadTimeoutError: UnixHTTPConnectionPool(host='localhost', port=None): Read timed out.
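Both failed runs die the same way: the Docker SDK's streaming read of the second image push exceeds the client's default read timeout (60 s in docker-py) over the Unix socket. Common mitigations are constructing the client with a longer timeout (`docker.from_env(timeout=...)`) or retrying the push. A minimal retry sketch, using a hypothetical wrapper that is not part of clipper_admin:

```python
import time

def retry_streaming(op, attempts=3, backoff=5.0):
    """Run a flaky zero-argument streaming operation (e.g. a large
    image push that can hit a read timeout) a few times before giving
    up. Hypothetical helper, not part of clipper_admin."""
    for attempt in range(1, attempts + 1):
        try:
            return op()
        except Exception:
            if attempt == attempts:
                raise  # out of retries; surface the original error
            time.sleep(backoff)
```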
[integration_py3_pyspark] 19-03-12:23:53:31 INFO [test_utils.py:182] ----------------------
[integration_py3_pyspark] 19-03-12:23:53:31 INFO [test_utils.py:183] Last ten containers status
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name awesome_borg, Image <Image: ''>, Status created, Label {}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name query_frontend_exporter-8714, Image <Image: 'clipper/frontend-exporter:0cb7d73247', 'clipper/frontend-exporter:c3feae0885', 'clipper/frontend-exporter:f1922baa35', 'clippertesting/frontend-exporter:14aac5ef08', 'clippertesting/frontend-exporter:27ca454fe9', 'clippertesting/frontend-exporter:31f275298d', 'clippertesting/frontend-exporter:342befe2c4', 'clippertesting/frontend-exporter:3992cdc971', 'clippertesting/frontend-exporter:3b943cfa5b', 'clippertesting/frontend-exporter:48c2c9dc10', 'clippertesting/frontend-exporter:4bc1f86adb', 'clippertesting/frontend-exporter:69db510826', 'clippertesting/frontend-exporter:8ea6e8859e', 'clippertesting/frontend-exporter:901e8de563', 'clippertesting/frontend-exporter:aa6407a053', 'clippertesting/frontend-exporter:af60d18384', 'clippertesting/frontend-exporter:b3f1cf6949', 'clippertesting/frontend-exporter:b8c5151d8e', 'clippertesting/frontend-exporter:dc3f29bdfe', 'clippertesting/frontend-exporter:develop', 'clippertesting/frontend-exporter:eb436fc33e', 'clippertesting/frontend-exporter:f43fa2717f', 'clippertesting/frontend-exporter:f4a606acc9'>, Status running, Label {'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.container.label': 'admin-test-cluster-2252'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name tensorflow-model_3-16650, Image <Image: 'tf-382-tensorflow-model:3'>, Status running, Label {'ai.clipper.model_container.label': 'tensorflow-model_3', 'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.container.label': 'tf-382'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name metric_frontend-85523, Image <Image: 'prom/prometheus:v2.1.0'>, Status exited, Label {'ai.clipper.metric.config': '/tmp/tmpgx9ALS.yml', 'maintainer': 'The Prometheus Authors prometheus-developers@googlegroups.com', 'ai.clipper.container.label': 'admin-test-cluster-4815', 'ai.clipper.metric.port': '11735'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name query_frontend-8714, Image <Image: 'clippertesting/query_frontend:3992cdc971', 'clippertesting/query_frontend:develop'>, Status running, Label {'ai.clipper.query_frontend.rpc.port': '43613', 'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.query_frontend.query.port': '39242', 'ai.clipper.container.label': 'admin-test-cluster-2252', 'ai.clipper.query_frontend.label': ''}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name mxnet-model_1-68564, Image <Image: 'mxnet-3095-mxnet-model:1'>, Status running, Label {'ai.clipper.model_container.label': 'mxnet-model_1', 'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.container.label': 'mxnet-3095'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name query_frontend_exporter-21937, Image <Image: 'clipper/frontend-exporter:0cb7d73247', 'clipper/frontend-exporter:c3feae0885', 'clipper/frontend-exporter:f1922baa35', 'clippertesting/frontend-exporter:14aac5ef08', 'clippertesting/frontend-exporter:27ca454fe9', 'clippertesting/frontend-exporter:31f275298d', 'clippertesting/frontend-exporter:342befe2c4', 'clippertesting/frontend-exporter:3992cdc971', 'clippertesting/frontend-exporter:3b943cfa5b', 'clippertesting/frontend-exporter:48c2c9dc10', 'clippertesting/frontend-exporter:4bc1f86adb', 'clippertesting/frontend-exporter:69db510826', 'clippertesting/frontend-exporter:8ea6e8859e', 'clippertesting/frontend-exporter:901e8de563', 'clippertesting/frontend-exporter:aa6407a053', 'clippertesting/frontend-exporter:af60d18384', 'clippertesting/frontend-exporter:b3f1cf6949', 'clippertesting/frontend-exporter:b8c5151d8e', 'clippertesting/frontend-exporter:dc3f29bdfe', 'clippertesting/frontend-exporter:develop', 'clippertesting/frontend-exporter:eb436fc33e', 'clippertesting/frontend-exporter:f43fa2717f', 'clippertesting/frontend-exporter:f4a606acc9'>, Status exited, Label {'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.container.label': 'admin-test-cluster-4815'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name mgmt_frontend-33536, Image <Image: 'clippertesting/management_frontend:3992cdc971', 'clippertesting/management_frontend:develop'>, Status running, Label {'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.management_frontend.label': '', 'ai.clipper.container.label': 'admin-test-cluster-2252', 'ai.clipper.management.port': '45942'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name mxnet-model_1-85375, Image <Image: 'mxnet-4164-mxnet-model:1'>, Status running, Label {'ai.clipper.model_container.label': 'mxnet-model_1', 'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.container.label': 'mxnet-4164'}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:186] Name query_frontend-21937, Image <Image: 'clippertesting/query_frontend:3992cdc971', 'clippertesting/query_frontend:develop'>, Status exited, Label {'ai.clipper.query_frontend.rpc.port': '43760', 'maintainer': 'Dan Crankshaw dscrankshaw@gmail.com', 'ai.clipper.query_frontend.query.port': '47192', 'ai.clipper.container.label': 'admin-test-cluster-4815', 'ai.clipper.query_frontend.label': ''}
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:188] ----------------------
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:189] Printing out logs
[integration_py3_pyspark] 19-03-12:23:53:32 INFO [test_utils.py:193] Name awesome_borg, Image <Image: ''>, Status created, Label {}
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 360, in _error_catcher
[integration_py3_pyspark] yield
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 442, in read
[integration_py3_pyspark] data = self._fp.read(amt)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 448, in read
[integration_py3_pyspark] n = self.readinto(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 478, in readinto
[integration_py3_pyspark] return self._readinto_chunked(b)
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 573, in _readinto_chunked
[integration_py3_pyspark] chunk_left = self._get_chunk_left()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 541, in _get_chunk_left
[integration_py3_pyspark] chunk_left = self._read_next_chunk_size()
[integration_py3_pyspark] File "/usr/lib/python3.5/http/client.py", line 501, in _read_next_chunk_size
[integration_py3_pyspark] line = self.fp.readline(_MAXLINE + 1)
[integration_py3_pyspark] File "/usr/lib/python3.5/socket.py", line 576, in readinto
[integration_py3_pyspark] return self._sock.recv_into(b)
[integration_py3_pyspark] socket.timeout: timed out
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 167, in <module>
[integration_py3_pyspark] deploy_and_test_model(sc, clipper_conn, svm_model, version)
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 64, in deploy_and_test_model
[integration_py3_pyspark] predict_fn, model, sc)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/deployers/pyspark.py", line 264, in deploy_pyspark_model
[integration_py3_pyspark] registry, num_replicas, batch_size, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 355, in build_and_deploy_model
[integration_py3_pyspark] container_registry, pkgs_to_install)
[integration_py3_pyspark] File "/clipper/clipper_admin/clipper_admin/clipper_admin.py", line 480, in build_model
[integration_py3_pyspark] for line in docker_client.images.push(repository=image, stream=True):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 307, in _stream_helper
[integration_py3_pyspark] data = reader.read(1)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 459, in read
[integration_py3_pyspark] raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
[integration_py3_pyspark] File "/usr/lib/python3.5/contextlib.py", line 77, in __exit__
[integration_py3_pyspark] self.gen.throw(type, value, traceback)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/urllib3/response.py", line 365, in _error_catcher
[integration_py3_pyspark] raise ReadTimeoutError(self._pool, None, 'Read timed out.')
[integration_py3_pyspark] urllib3.exceptions.ReadTimeoutError: UnixHTTPConnectionPool(host='localhost', port=None): Read timed out.
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 225, in _raise_for_status
[integration_py3_pyspark] response.raise_for_status()
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/requests/models.py", line 940, in raise_for_status
[integration_py3_pyspark] raise HTTPError(http_error_msg, response=self)
[integration_py3_pyspark] requests.exceptions.HTTPError: 501 Server Error: Not Implemented for url: http+docker://localhost/v1.35/containers/6d17116ce3011d9fff91b196e26a64e006ece3b47b6fedbc8c713f4118f509f2/logs?timestamps=0&follow=0&stderr=1&stdout=1&tail=all
[integration_py3_pyspark]
[integration_py3_pyspark] During handling of the above exception, another exception occurred:
[integration_py3_pyspark]
[integration_py3_pyspark] Traceback (most recent call last):
[integration_py3_pyspark] File "/clipper/integration-tests/deploy_pyspark_models.py", line 194, in <module>
[integration_py3_pyspark] log_docker(clipper_conn)
[integration_py3_pyspark] File "/clipper/integration-tests/test_utils.py", line 194, in log_docker
[integration_py3_pyspark] logger.info(cont.logs())
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/models/containers.py", line 266, in logs
[integration_py3_pyspark] return self.client.api.logs(self.id, **kwargs)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/utils/decorators.py", line 19, in wrapped
[integration_py3_pyspark] return f(self, resource_id, *args, **kwargs)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/container.py", line 818, in logs
[integration_py3_pyspark] return self._get_result(container, stream, res)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 409, in _get_result
[integration_py3_pyspark] return self._get_result_tty(stream, res, self._check_is_tty(container))
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 418, in _get_result_tty
[integration_py3_pyspark] self._raise_for_status(res)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/api/client.py", line 227, in _raise_for_status
[integration_py3_pyspark] raise create_api_error_from_http_exception(e)
[integration_py3_pyspark] File "/usr/local/lib/python3.5/dist-packages/docker/errors.py", line 31, in create_api_error_from_http_exception
[integration_py3_pyspark] raise cls(e, response=response, explanation=explanation)
[integration_py3_pyspark] docker.errors.APIError: 501 Server Error: Not Implemented ("configured logging driver does not support reading")
[integration_py3_pyspark] Starting Trial 0 with timeout 2400.0 seconds
[integration_py3_pyspark] Starting Trial 1 with timeout 2400.0 seconds
[integration_py3_pyspark] All retry failed.
CI_test.Makefile:89: recipe for target 'integration_py3_pyspark' failed
make: *** [integration_py3_pyspark] Error 1
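The final `docker.errors.APIError` above ("501 Not Implemented: configured logging driver does not support reading") is raised from `cont.logs()` inside `test_utils.log_docker`, so the log dump itself crashes and masks the original pyspark test failure. A minimal sketch of a more defensive version (not the fix this PR applies; `safe_container_logs` is a hypothetical helper, and a broad `except` stands in for `docker.errors.APIError` to keep the sketch free of the docker dependency):

```python
def safe_container_logs(container):
    """Return a container's logs, or a placeholder when the daemon refuses.

    In the real harness the exception to catch is docker.errors.APIError,
    e.g. the 501 raised when the logging driver does not support reading.
    """
    try:
        return container.logs()
    except Exception as e:  # docker.errors.APIError in the real harness
        return "<logs unavailable: %s>" % e
```

`log_docker` could call this in its loop instead of `cont.logs()` directly, so one unreadable container no longer aborts the whole diagnostic dump.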

@simon-mo
Contributor

jenkins test this please

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Clipper-PRB/1804/
Test PASSed.

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Clipper-PRB/1803/
Test FAILed.

@simon-mo simon-mo merged commit 9163dce into ucbrise:develop Mar 13, 2019
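For reference, dependency pins of the shape this PR applies look like the following. The exact versions in the merged diff are not shown in this excerpt, so these are illustrative values based on the published advisories (requests < 2.20.0: CVE-2018-18074, credential leak on redirect; PyYAML < 4.1: CVE-2017-18342, code execution via `yaml.load`):

```python
# Illustrative install_requires entries; the actual pins are in the PR diff.
install_requires = [
    "requests>=2.20.0",  # first release with the CVE-2018-18074 fix
    "pyyaml>=4.2b1",     # first available release after CVE-2017-18342 (4.1 was yanked)
]
```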
rkooo567 pushed a commit to rkooo567/clipper that referenced this pull request Apr 27, 2019
rkooo567 pushed a commit to rkooo567/clipper that referenced this pull request Apr 28, 2019
rkooo567 pushed a commit to rkooo567/clipper that referenced this pull request Apr 28, 2019