
Add Fused CrossEntropy and Update Vocab Sizes #251

Merged: 18 commits merged into mosaicml:main from the xentropy branch on Mar 24, 2023

Conversation

nik-mosaic (Contributor)

Add Fused Cross Entropy, a loss function that is (almost) mathematically equivalent to torch.nn.CrossEntropyLoss.

This loss function's dependency is installed via a new entry in llm/requirements.txt. It is enabled by default, and we throw an error if you are on CPU and have not turned it off. Turn it off with the new config option model.loss_fn: torch_crossentropy; turn it on by omitting loss_fn or by setting model.loss_fn: fused_crossentropy.
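For illustration, here is a minimal sketch of how the model.loss_fn switch could map to a loss module. This is not the PR's exact code: the build_loss_fn helper name is ours, and we assume the fused kernel is importable from flash-attn's xentropy extension (installed via the new requirements entry); the real import path and error message may differ.

```python
import torch.nn as nn


def build_loss_fn(loss_fn: str = 'fused_crossentropy') -> nn.Module:
    """Sketch: map the model.loss_fn config value to a loss module."""
    if loss_fn == 'fused_crossentropy':
        try:
            # Assumed import path; the extension is installed via the new
            # entry in llm/requirements.txt and requires a CUDA device.
            from flash_attn.losses.cross_entropy import \
                CrossEntropyLoss as FusedCrossEntropyLoss
        except ImportError as e:
            raise RuntimeError(
                'fused_crossentropy requires the xentropy CUDA extension; '
                'set model.loss_fn: torch_crossentropy to run without it.') from e
        return FusedCrossEntropyLoss(ignore_index=-100)
    elif loss_fn == 'torch_crossentropy':
        return nn.CrossEntropyLoss(ignore_index=-100)
    raise ValueError(f'Unknown loss_fn: {loss_fn}')
```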

This PR also updates the vocab sizes to be multiples of 64, accounting for the new tokenizer.
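Padding the vocabulary up to a multiple of 64 keeps the embedding and output-projection matrix dimensions aligned for GPU tensor cores. A quick sketch of the rounding (the helper name is hypothetical, not from the PR):

```python
def pad_vocab_size(vocab_size: int, multiple: int = 64) -> int:
    """Round vocab_size up to the nearest multiple of `multiple`."""
    return ((vocab_size + multiple - 1) // multiple) * multiple


# e.g. a 50257-token GPT-2-style vocabulary pads to 50304 (= 786 * 64)
assert pad_vocab_size(50257) == 50304
```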

@nik-mosaic marked this pull request as ready for review March 24, 2023 00:00
@vchiley (Contributor) left a comment


Left a few comments.

Review comments (all resolved):
- examples/llm/src/models/mosaic_gpt.py (four threads, outdated)
- examples/llm/requirements.txt (outdated)
- examples/llm/yamls/mosaic_gpt/testing.yaml
vchiley and others added 11 commits March 24, 2023 20:14
* mv init
* mv export inf file
* reorg model file struct
* kpm -> attn_mask
* fix test; lint
* updt init test import
* lint
* Add fused xentropy test
@abhi-mosaic (Contributor) left a comment


Looks good and excited to start using this in all our LLM runs, thank you @nik-mosaic!

@vchiley merged commit fccd16e into mosaicml:main Mar 24, 2023
@nik-mosaic deleted the xentropy branch March 24, 2023 23:21
@nik-mosaic mentioned this pull request Mar 31, 2023