
Is it possible to make a Ludwig configuration for translation from one language to another? #1630

Answered by justinxzhao
PeterPirog asked this question in Q&A
Discussion options


The answer is yes (on v0.4/TensorFlow).

For a text2text translation task, what you've outlined below is essentially it. We're working on reimplementing text decoders in PyTorch, so this functionality should also be available in the upcoming PyTorch-based 0.5 release.

input_features:
    -
        name: text_language1
        type: text
        encoder: bert
        pretrained_model_name_or_path: bert-base-uncased
        trainable: false

output_features:
    -
        name: text_language2
        type: text
        decoder: lstm
        beam_size: 5

We don't support pre-trained text decoders, but you should be able to get something reasonable with a basic LSTM or RNN decoder.
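For reference, the YAML above can also be expressed as a Python dict for Ludwig's programmatic API. This is a minimal sketch: the dict mirrors the config shown in the answer, while the commented-out `LudwigModel` usage and the `parallel_corpus.csv` dataset name are assumptions for illustration (they require Ludwig installed and a dataset with matching column names):

```python
# Translation config from the answer above, as a Python dict.
# Keys mirror the YAML config exactly (Ludwig v0.4 schema).
config = {
    "input_features": [
        {
            "name": "text_language1",
            "type": "text",
            "encoder": "bert",
            "pretrained_model_name_or_path": "bert-base-uncased",
            "trainable": False,
        }
    ],
    "output_features": [
        {
            "name": "text_language2",
            "type": "text",
            "decoder": "lstm",
            "beam_size": 5,
        }
    ],
}

# Hypothetical usage (assumes `pip install ludwig` and a CSV with
# text_language1/text_language2 columns of parallel sentences):
# from ludwig.api import LudwigModel
# model = LudwigModel(config)
# train_stats = model.train(dataset="parallel_corpus.csv")
```

Passing the config as a dict instead of a YAML file is convenient when generating or sweeping configurations from code.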

Answer selected by tgaddair