
Input transforms in posterior call of MultiTaskGPytorchModel #2232

Closed
jduerholt opened this issue Feb 29, 2024 · 2 comments
Labels
enhancement (New feature or request)

@jduerholt
Contributor

In the posterior call of the MultiTaskGPytorchModel, the task feature is identified on the untransformed input and then the input is transformed based on the registered input transforms.

X_full = self.transform_inputs(X_full)
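
Schematically, the surrounding code does roughly the following (a paraphrase of the BoTorch source at the time, not a verbatim quote):

X_full = _make_X_full(X=X, output_indices=output_indices, tf=self._task_feature)  # task feature located on the raw X
X_full = self.transform_inputs(X_full)  # input transforms applied only afterwards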

Would it be possible to apply the transforms before actually identifying the task feature?

The use case is the following: we had the idea of encoding the task feature as a kind of categorical feature that is one-hot encoded, and then applying a OneHotToNumeric transform to map it back to an integer. What do you think?
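
For concreteness, a minimal sketch of that setup (the dimensions, cardinality, and transform flags are illustrative; as described above, this currently breaks down in posterior, where the task feature is looked up on the raw one-hot input):

import torch
from botorch.models import MultiTaskGP
from botorch.models.transforms.input import OneHotToNumeric

# 2 continuous features plus a task feature one-hot encoded over 3 tasks,
# so the raw input has 2 + 3 = 5 columns (the one-hot block is rightmost).
n, num_tasks = 10, 3
cont_X = torch.rand(n, 2)
tasks = torch.randint(num_tasks, (n,))
train_X = torch.cat(
    [cont_X, torch.nn.functional.one_hot(tasks, num_tasks).to(cont_X)], dim=-1
)
train_Y = torch.rand(n, 1)

# Collapse the one-hot block starting at column 2 (cardinality 3) into a
# single numeric task column, mapping (n, 5) inputs to (n, 3).
one_hot_to_numeric = OneHotToNumeric(
    dim=5, categorical_features={2: 3}, transform_on_train=True
)

model = MultiTaskGP(
    train_X=train_X,
    train_Y=train_Y,
    task_feature=2,  # index of the task column in the *transformed* input
    input_transform=one_hot_to_numeric,
)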

@saitcakmak
Contributor

Hi @jduerholt. I think the main reason it is done this way is that the transforms are typically configured to work with X_full, which includes the task feature. This makes the same transform applicable to both train_X in __init__ and X_full in posterior, since both include the task feature.

We could change this to support your use case, but it would require some care to make sure we don't break anything unintentionally. We might also need to require transforms like Normalize to specify explicit indices, so that they remain applicable to X both with and without the task feature.
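
For reference, Normalize already accepts an indices argument, so excluding a task column from normalization might look like this (the column layout is illustrative):

from botorch.models.transforms.input import Normalize

# 5-dimensional input whose task feature sits in column 2: normalize only
# the continuous columns and leave the task column untouched.
normalize = Normalize(d=5, indices=[0, 1, 3, 4])

In this sketch the indices refer to the full X including the task column; handling X both with and without the task feature would, as noted above, need additional care.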

@jduerholt
Contributor Author

Hi @saitcakmak,

thanks for your quick response and sorry for my late one ;) I think changing this would create too much overhead in existing workflows, due to the need to reconfigure the other transforms, so we will fix it on our side and directly use an ordinal encoding of the task parameter. It will not be too much effort.

Best,

Johannes

@esantorella added the enhancement (New feature or request) label on Mar 8, 2024
@esantorella closed this as not planned on Mar 8, 2024