
Need better documentation for TFMA's MetricConfig for TFX's Evaluator #4022

Closed
sumitbinnani opened this issue Jul 13, 2021 · 2 comments

@sumitbinnani

URL(s) with the issue:

Description of issue (what needs changing):

For TFX's Evaluator, it seems the metric config does not rely on the `__init__` method of the (custom) metric class; instead, it relies on its `get_config` and `from_config` methods.

Clear description:

It seems the config does not rely on the `__init__` method of the (custom) metric class; instead, it relies on its `get_config` and `from_config` methods. This distinction is important because TFX's Evaluator serializes and deserializes the metric, and `__init__` defaults are not used during the final metrics evaluation.
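To make this concrete, here is a minimal sketch of a custom Keras metric; the class name `MyCustomMetric` and the `threshold` parameter are hypothetical. The point is that every `__init__` argument is round-tripped through `get_config`, so the Evaluator can rebuild the metric with `from_config` after serialization:

```python
import tensorflow as tf


class MyCustomMetric(tf.keras.metrics.Mean):
    """Hypothetical metric: mean rate of predictions above a threshold."""

    def __init__(self, threshold=0.5, name='my_custom_metric', **kwargs):
        super().__init__(name=name, **kwargs)
        self.threshold = threshold

    def update_state(self, y_true, y_pred, sample_weight=None):
        # Count predictions above the threshold as positives.
        matches = tf.cast(y_pred > self.threshold, tf.float32)
        return super().update_state(matches, sample_weight=sample_weight)

    def get_config(self):
        # The Evaluator rebuilds the metric via from_config (which by default
        # calls cls(**config)), so every __init__ argument must appear here;
        # values set only in __init__ are lost after serialization.
        config = super().get_config()
        config.update({'threshold': self.threshold})
        return config
```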

Correct links

Same as the clear description above.

Parameters defined

Yes. However, the documented way of defining the config breaks for TFX's Evaluator.
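For reference, a hedged sketch of how such a metric might be wired into the Evaluator's eval config; the module path `my_project.metrics` is hypothetical, and the assumption is that `config` takes a JSON-encoded kwargs dict that TFMA applies when deserializing the metric, not via the class's `__init__` defaults:

```python
import json

import tensorflow_model_analysis as tfma

eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key='label')],
    metrics_specs=[
        tfma.MetricsSpec(metrics=[
            tfma.MetricConfig(
                class_name='MyCustomMetric',
                module='my_project.metrics',  # hypothetical module path
                # Serialized kwargs; applied when the metric is rebuilt from
                # its config during evaluation.
                config=json.dumps({'threshold': 0.7}),
            ),
        ]),
    ],
)
```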

Returns defined

Yes.

Raises listed and defined

No

Request visuals, if applicable

No

@sumitbinnani
Author

Link to TFMA Issue: tensorflow/model-analysis#134

@mdreves
Member

mdreves commented Jul 14, 2021

Closing, will track in tensorflow/model-analysis#134

@mdreves mdreves closed this as completed Jul 14, 2021