Tortoise inference fix and fix zoo unit tests #3010

Merged
merged 11 commits into dev from tortoise_infer_fix
Sep 29, 2023

Conversation

Edresson
Contributor

It fixes #3005 and adds an inference unit test for Tortoise inference.
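
As a rough illustration, a smoke test for this could look like the sketch below; the zoo model name and the assertions are assumptions for illustration, not the actual test added in this PR.

```python
# Hypothetical sketch of a Tortoise inference smoke test, not the test added in this PR.
# The zoo model name is an assumption; check `tts --list_models` for the exact identifier.
import os
import tempfile

from TTS.api import TTS


def test_tortoise_inference_smoke():
    tts = TTS("tts_models/en/multi-dataset/tortoise-v2", gpu=False)
    out_path = os.path.join(tempfile.mkdtemp(), "tortoise_out.wav")
    tts.tts_to_file(text="This is a Tortoise smoke test.", file_path=out_path)
    # A non-empty wav file is enough to catch the inference crash reported in #3005.
    assert os.path.isfile(out_path)
    assert os.path.getsize(out_path) > 0
```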

@Edresson Edresson requested a review from erogol September 28, 2023 13:34
@Lenos500

Don't forget about XTTS: tokenizer.py in that model raises a VoiceBpeTokenizer attribute error related to the preprocess object.

@Edresson
Contributor Author

Edresson commented Sep 28, 2023

Hi @Lenos500,

Sorry, but I don't get what you mean. Tortoise and XTTS have different tokenizers. XTTS inference is working fine for me.

Can you give a reproducible example, please?

@Lenos500

It's a closed issue that I already posted about: #3000.

@Edresson Edresson changed the title from "Tortoise inference fix and unit test" to "Tortoise inference fix and fix zoo unit tests" on Sep 28, 2023
@Lenos500

Do you think the error I mentioned for xtts_v1 can be fixed while keeping it able to load on CPU? I'm not the only user hitting the VoiceBpeTokenizer error when running the model locally on a Linux machine, be it Ubuntu or Kali Linux.

@Edresson
Contributor Author

Hi,

Can you follow the instructions in the docs to do inference with the XTTS model, please?
Using the latest 🐸 TTS version I was not able to reproduce the issue you mentioned. I was able to load the model and run inference on both GPU and CPU with the TTS API and the command line. If you are still unable to load the model, please open an issue with all the details, including your environment, TTS version, and so on.
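
For reference, a minimal sketch of XTTS inference through the Python API is below, assuming the xtts_v1 zoo name and an example reference clip; the tts command line accepts matching options (model name, text, speaker wav, language), but check tts --help on your installed version for the exact flags.

```python
# Minimal sketch of XTTS inference via the high-level API.
# The model name and file paths are examples; adjust them to your setup.
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v1", gpu=False)  # CPU works too
tts.tts_to_file(
    text="Hello from XTTS.",
    speaker_wav="/path/to/reference_speaker.wav",  # short clip of the target voice
    language="en",
    file_path="xtts_out.wav",
)
```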

@erogol erogol merged commit 4c3c11c into dev Sep 29, 2023
48 checks passed
@erogol erogol deleted the tortoise_infer_fix branch September 29, 2023 11:41
@Lenos500

It works now; it turned out I needed to put the vocab.json file in the same directory I'm using to load the model and its config. Even the documentation does not mention the vocab file in its instructions.
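
For anyone hitting the same problem, here is a sketch of loading XTTS from a local directory; the class names and load_checkpoint arguments follow my reading of the XTTS model code and may differ between releases, but the key point is that config.json, the model checkpoint, and vocab.json sit in the same directory.

```python
# Sketch: loading a locally downloaded XTTS model.
# Assumes /path/to/xtts/ contains config.json, the model checkpoint, and vocab.json.
from TTS.tts.configs.xtts_config import XttsConfig
from TTS.tts.models.xtts import Xtts

config = XttsConfig()
config.load_json("/path/to/xtts/config.json")

model = Xtts.init_from_config(config)
# With vocab.json next to the checkpoint, the tokenizer is found and the
# VoiceBpeTokenizer attribute error should not appear.
model.load_checkpoint(config, checkpoint_dir="/path/to/xtts/", eval=True)
# model.cuda()  # optional; CPU loading also works
```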

Successfully merging this pull request may close these issues.

[Bug] Tortoise inference not functional