storage-secret-config should have the same parameters as Kserve #456
Comments
Hello, thanks for pointing this out! In the adapter code that processes the storage configuration, if …
I am new to open source; can I start contributing with this issue?
Sure, @AHB102. @tjohnson31415 gave some insight above; please ask if you need any clarification. More information about the …
@tjohnson31415 Can you please explain what exactly must be changed? There are discrepancies between "bucket" and "default_bucket" in the JSON format of storage-secret-config in ModelMesh, right? So we should just make one of them the standard?
Hi @AHB102, as noted in the issue description, using "bucket" should be the standard to match KServe, so that is already decided. In this ModelMesh repo, the documentation and example files use "default_bucket"; see the GH search results for where we reference it. As I noted in my previous comment, use of "bucket" should already be fully supported in the code. As a validation of the change, you should also test that the quickstart install still works after modifying …
@tjohnson31415 Got it: identify and replace every reference to "default_bucket" with "bucket", then test the quickstart installation to validate it. Can I work on it?
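The replacement step described above can be sketched as a small shell demo. This runs on a hypothetical scratch file rather than the actual repo files, and GNU `sed`'s in-place flag is assumed:

```shell
# Scratch-file demo of the rename; the real change would be applied to the
# repo's documentation and config files instead.
tmp=$(mktemp)
printf '"default_bucket": "modelmesh-example-models"\n' > "$tmp"
# Rewrite the old key to the KServe-style key in place (GNU sed syntax).
sed -i 's/default_bucket/bucket/g' "$tmp"
cat "$tmp"    # "bucket": "modelmesh-example-models"
rm -f "$tmp"
```

After a repo-wide run of a command like this, the resulting diff should still be reviewed by hand before committing, since the validation step is the quickstart install itself.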
Hi @AHB102, just wanted to confirm whether you still plan to work on this. I can assign it to you if so! |
@rafvasq Yup, I would like to work on it. |
Great, thanks @AHB102! |
@aayushsss1 sure, feel free! |
#### Motivation
Replacing `default_bucket` -> `bucket` everywhere in this repo to ensure it's consistent with KServe.

#### Modifications
Replaced every instance of `default_bucket` with `bucket`.

#### Result
Tested the [quickstart install](https://github.com/kserve/modelmesh-serving/blob/main/docs/quickstart.md) after modifying [quickstart.yaml](https://github.com/kserve/modelmesh-serving/blob/6c86da9473d50de63f9ea3af8a4d7c223849547e/config/dependencies/quickstart.yaml#L127).

Pods up and running:

```
kubectl get pods
NAME                                              READY   STATUS    RESTARTS   AGE
etcd-6fdc487479-m9pkx                             1/1     Running   0          32m
minio-6b5c846587-8bwdv                            1/1     Running   0          32m
modelmesh-controller-5cd8d68bc-9ls9p              1/1     Running   0          31m
modelmesh-serving-mlserver-1.x-66bb94dcf6-hvgzj   4/4     Running   0          26m
modelmesh-serving-mlserver-1.x-66bb94dcf6-qtdzw   4/4     Running   0          26m
```

Model deployed and InferenceService is Ready:

```
kubectl get isvc
NAME                   URL                                               READY   PREV   LATEST   PREVROLLEDOUTREVISION   LATESTREADYREVISION   AGE
example-sklearn-isvc   grpc://modelmesh-serving.modelmesh-serving:8033   True
```

```
kubectl describe isvc example-sklearn-isvc
Name:         example-sklearn-isvc
Namespace:    modelmesh-serving
Labels:       <none>
Annotations:  serving.kserve.io/deploymentMode: ModelMesh
API Version:  serving.kserve.io/v1beta1
Kind:         InferenceService
Metadata:
  Creation Timestamp:  2024-05-28T07:19:00Z
  Generation:          1
  Resource Version:    5950
  UID:                 db71cf11-7842-4bc1-af97-647282e6b9b9
Spec:
  Predictor:
    Model:
      Model Format:
        Name:  sklearn
      Storage:
        Key:   localMinIO
        Path:  sklearn/mnist-svm.joblib
Status:
  Components:
    Predictor:
      Grpc URL:  grpc://modelmesh-serving.modelmesh-serving:8033
      Rest URL:  http://modelmesh-serving.modelmesh-serving:8008
      URL:       grpc://modelmesh-serving.modelmesh-serving:8033
  Conditions:
    Last Transition Time:  2024-05-28T07:25:07Z
    Status:                True
    Type:                  PredictorReady
    Last Transition Time:  2024-05-28T07:25:07Z
    Status:                True
    Type:                  Ready
  Model Status:
    Copies:
      Failed Copies:  0
      Total Copies:   1
    States:
      Active Model State:  Loaded
      Target Model State:
    Transition Status:     UpToDate
  URL:                     grpc://modelmesh-serving.modelmesh-serving:8033
Events:                    <none>
```

Inference request successful:

```
MODEL_NAME=example-sklearn-isvc
grpcurl \
  -plaintext \
  -proto fvt/proto/kfs_inference_v2.proto \
  -d '{ "model_name": "'"${MODEL_NAME}"'", "inputs": [{ "name": "predict", "shape": [1, 64], "datatype": "FP32", "contents": { "fp32_contents": [0.0, 0.0, 1.0, 11.0, 14.0, 15.0, 3.0, 0.0, 0.0, 1.0, 13.0, 16.0, 12.0, 16.0, 8.0, 0.0, 0.0, 8.0, 16.0, 4.0, 6.0, 16.0, 5.0, 0.0, 0.0, 5.0, 15.0, 11.0, 13.0, 14.0, 0.0, 0.0, 0.0, 0.0, 2.0, 12.0, 16.0, 13.0, 0.0, 0.0, 0.0, 0.0, 0.0, 13.0, 16.0, 16.0, 6.0, 0.0, 0.0, 0.0, 0.0, 16.0, 16.0, 16.0, 7.0, 0.0, 0.0, 0.0, 0.0, 11.0, 13.0, 12.0, 1.0, 0.0] }}]}' \
  localhost:8033 \
  inference.GRPCInferenceService.ModelInfer
Handling connection for 8033
{
  "modelName": "example-sklearn-isvc__isvc-6b2eb0b8bf",
  "outputs": [
    {
      "name": "predict",
      "datatype": "INT64",
      "shape": ["1", "1"],
      "contents": {
        "int64Contents": ["8"]
      }
    }
  ]
}
```

This PR closes #456.

Signed-off-by: Aayush Subramaniam <aayushsubramaniam@gmail.com>
Is your feature request related to a problem? If so, please describe.
The JSON format of the storage-secret-config in ModelMesh expects the default bucket to be specified as follows:

`"default_bucket": "modelmesh-example-models"`

However, this is not consistent with the KServe implementation, which expects the default bucket to be specified as follows:

`"bucket": "modelmesh-example-models"`

This mismatch in the key is frustrating. Following a consistent format will give users a better experience across both serving stacks (KServe and ModelMesh).
Describe your proposed solution
Match the storage-secret-config format that KServe uses, i.e. the key should be `bucket` rather than `default_bucket`.
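For illustration only, a MinIO-style entry in the storage secret might look like the following once the KServe-style key is in use. The field names besides `bucket` mirror the quickstart example, and all values here are placeholders, not the actual quickstart credentials:

```json
{
  "type": "s3",
  "access_key_id": "<access-key>",
  "secret_access_key": "<secret-key>",
  "endpoint_url": "http://minio:9000",
  "bucket": "modelmesh-example-models"
}
```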