I believe the plugin should prevent pipeline hooks from firing while its CLI is used, as this creates unpredictable issues.
Example with the kedro-mlflow plugin:
- We use the kedro-mlflow plugin.
- The project has two environments: local and remote.
- The remote env contains catalog entries with the proper paths to the data in cloud storage, plus the MLflow config for the cloud and the Kubeflow config.
- The developer (or the CI/CD runner) does not have access from the local machine to the remote MLflow API.
When the developer (or CI/CD) tries to compile or upload a pipeline using --env remote, the following happens (see the sketch after this list):

1. An instance of ContextHelper is created:
   - a Kedro session is created,
   - the Kedro context is initialized,
   - the after_context_created hook is triggered,
   - the catalog instance is retrieved,
   - the after_catalog_created hook is triggered.
2. The kedro-mlflow plugin has an after_catalog_created hook which tries to access the MLflow API.
3. The developer (or CI/CD) has no access to the MLflow API in the remote env.
4. Compiling/uploading fails.
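For reference, the same chain can be reproduced with plain Kedro APIs outside the plugin. A minimal sketch (the project path and env name are assumptions, and the exact KedroSession.create signature varies between Kedro versions):

```python
from pathlib import Path

from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project

# Assumed: we are inside a Kedro project that defines a "remote" env.
project_path = Path.cwd()
bootstrap_project(project_path)

with KedroSession.create(project_path=project_path, env="remote") as session:
    # Loading the context fires `after_context_created` for every registered
    # hook, including hooks auto-registered by installed plugins.
    context = session.load_context()

    # Accessing the catalog fires `after_catalog_created`; this is where a
    # plugin hook that talks to a remote service (e.g. MLflow) fails when the
    # service is unreachable from the local machine or CI runner.
    catalog = context.catalog
```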
This behavior is neither needed nor expected. The kedro-kubeflow plugin only needs the catalog, the pipeline parameters, and the proper KFP config to create a proper KFP pipeline.
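For what it's worth, Kedro itself has a project-level switch that disables auto-registered plugin hooks, but it applies globally (including kedro run), so it does not solve the per-command case. A settings.py sketch, assuming the goal is to keep kedro-mlflow installed but silence its hooks:

```python
# settings.py (project level): disable hooks auto-registered by a plugin.
# Note: this disables them for *all* commands, not just `kedro kubeflow ...`.
DISABLE_HOOKS_FOR_PLUGINS = ("kedro-mlflow",)
```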
@szczeles / @em-pe what do you think? We've actually been relying on the hooks being invoked in some projects. Maybe you @DmitriyLamzin could add this as a flag in the CLI at the plugin level, like kedro kubeflow --disable-hooks compile, etc.?
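Purely to illustrate the shape of such a flag, a click-based sketch; the command layout and the use of the session's private _hook_manager are assumptions, not the plugin's actual code:

```python
import click
from kedro.framework.session import KedroSession


@click.group(name="kubeflow")
@click.option("--disable-hooks", is_flag=True, default=False,
              help="Do not fire project/plugin hooks while compiling or uploading.")
@click.pass_context
def kubeflow_group(ctx, disable_hooks):
    # Hypothetical group-level flag, passed down to subcommands.
    ctx.ensure_object(dict)
    ctx.obj["disable_hooks"] = disable_hooks


@kubeflow_group.command()
@click.option("--env", default="local")
@click.pass_context
def compile(ctx, env):
    session = KedroSession.create(env=env)
    if ctx.obj["disable_hooks"]:
        # `_hook_manager` is private Kedro API, so this is only a sketch:
        # unregister every hook implementation before the context and catalog
        # are created, so `after_context_created` / `after_catalog_created`
        # never reach plugin code such as kedro-mlflow's hook.
        hook_manager = session._hook_manager
        for plugin in list(hook_manager.get_plugins()):
            hook_manager.unregister(plugin)
    context = session.load_context()
    catalog = context.catalog  # no plugin hooks fire at this point
    # ... build the KFP pipeline from the catalog, params and KFP config ...
```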