Wrap unused lm_eval in try-catch: Unblock dist_run #1228

Merged: 10 commits, Sep 30, 2024
11 changes: 8 additions & 3 deletions torchchat/model.py
```diff
@@ -30,9 +30,14 @@
     SequenceParallel,
 )
 from torch.nn import functional as F
-# TODO: remove this after we figure out where in torchtune an `evaluate` module
-# is being imported, which is being confused with huggingface's `evaluate``.
-import lm_eval  # noqa
+
+try:
+    # TODO: remove this after we figure out where in torchtune an `evaluate` module
+    # is being imported, which is being confused with huggingface's `evaluate``.
+    import lm_eval  # noqa
+except Exception:
+    pass
 
 from torchtune.models.clip import clip_vision_encoder
 from torchtune.models.llama3_1._component_builders import llama3_1 as llama3_1_builder
 from torchtune.models.llama3_2_vision._component_builders import (
```
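The change above is an instance of the optional-import pattern: a dependency that only some code paths need is imported inside try/except, so environments without it (here, the `dist_run` path) are not blocked at module import time. A minimal sketch of the pattern, using a placeholder module name `nonexistent_optional_pkg` rather than torchchat's actual code:

```python
# Sketch of the optional-import pattern (not torchchat's actual code).
# `nonexistent_optional_pkg` is a hypothetical optional dependency.
try:
    import nonexistent_optional_pkg  # noqa: F401
    HAS_OPTIONAL = True
except Exception:  # ImportError, or any error raised while importing
    nonexistent_optional_pkg = None
    HAS_OPTIONAL = False

def evaluate_model():
    # Fail loudly only when the optional feature is actually used,
    # instead of crashing every caller at import time.
    if not HAS_OPTIONAL:
        raise RuntimeError(
            "nonexistent_optional_pkg is required for evaluation"
        )

print(HAS_OPTIONAL)  # False here, since the placeholder module is missing
```

Catching broad `Exception` (as the PR does) also swallows errors raised during the dependency's own import, not just a missing package; a stricter variant would catch only `ImportError`.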