
Add 'inference_config' on ES >=7.8 #174

Merged (2 commits) on Apr 14, 2020

Conversation

@sethmlarson (Contributor) commented Apr 10, 2020

I wasn't sure how else to solve this while still issuing a single request. I added a utility called es_version() which maintains a cache of the cluster's ES version on the client. Obviously this isn't necessarily the same as the version of the ML ES node, but I'm unsure whether we're able to get that information.

Since this might be an issue in the future, maybe we could have a feature-flag system: if the client encounters an error when sending with an inference_config, we turn off the flag and then resend without inference_config. Would love ideas here.

Closes #164

@stevedodson (Contributor) left a comment

LGTM

This PR does raise the issue of unit testing against multiple ES versions. Maybe we should move towards a test environment with Dockerized ES instances? This might be something to discuss with the clients team, as they may have solved it elsewhere.

@sethmlarson sethmlarson merged commit e1cacea into elastic:master Apr 14, 2020
@sethmlarson sethmlarson deleted the ml-inference-config branch April 14, 2020 12:51
Successfully merging this pull request may close these issues:

Add inference_config to Elasticsearch model configurations